Application Developer
Hyderabad
Job No. atci-4424180-s1723518
Full-time
Job Description
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : PySpark
Good to have skills : Oracle Procedural Language Extensions to SQL (PLSQL), Amazon Web Services (AWS)
Minimum 5 year(s) of experience is required
Educational Qualification : Any Graduation
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using PySpark. Your typical day will involve working with PySpark, Oracle Procedural Language Extensions to SQL (PLSQL), and Amazon Web Services (AWS) to develop and maintain applications that meet business needs.
Key Responsibilities:
• Work on client projects to deliver AWS-, PySpark-, and Databricks-based data engineering and analytics solutions.
• Build and operate very large data warehouses and data lakes.
• Design, code, tune, and optimize ETL and big data processes using Apache Spark.
• Build data pipelines and applications to stream and process datasets at low latency.
• Handle data efficiently: track data lineage, ensure data quality, and improve data discoverability.
Technical Experience:
• Minimum of 1 year of experience in Databricks engineering solutions on the AWS Cloud platform using PySpark.
• Minimum of 3 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.
• Minimum of 2 years of experience in one or more programming languages: Python, Java, Scala.
• Experience using Airflow for data pipelines in at least 1 project.
• 1 year of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform.
Additional Information:
• The candidate must be willing to work in B shift.
• This position is based at our Hyderabad office.
Qualifications
Any Graduation
Please be informed that at any given point in time, you can only have one "Active" application.