Application Developer
Hyderabad
Job No. atci-4658656-s1799562
Full-time
Job Description
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Databricks Unified Data Analytics Platform, PySpark, Amazon Web Services (AWS)
Good to have skills : NA
Minimum 7.5 years of experience is required
Educational Qualification : 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will work with the Databricks Unified Data Analytics Platform and apply your expertise in Amazon Web Services (AWS) and PySpark. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring the applications meet the required standards and functionality.
Key Responsibilities:
• Work on client projects to deliver data engineering and analytics solutions based on Databricks, PySpark, AWS, and Terraform.
• Build and operate very large data warehouses and delta lakes.
• Design, code, optimize, and tune big data processes using Apache Spark.
• Build data pipelines and applications to stream and process datasets at low latency, orchestrated with Airflow.
• Handle data efficiently: track data lineage, ensure data quality, and improve the discoverability of data.
Technical Experience:
• Minimum of 5 years of experience delivering Databricks engineering solutions on the AWS Cloud platform: PySpark, Databricks SQL, data pipelines using Delta Lake, provisioning AWS infrastructure with Terraform, and orchestrating jobs with Airflow.
• Minimum of 5 years of experience in Databricks, AWS, PySpark, ETL, Big Data/Hadoop, and data warehouse architecture and delivery.
• Minimum of 2 years of experience in real-time streaming using Kafka/Kinesis.
• Minimum of 4 years of experience in one or more programming languages, including Python.
• At least 1 year of project experience using Airflow for data pipelines.
• At least 1 year of experience developing CI/CD pipelines using Git, Jenkins, shell scripting, and Terraform.
• Ready to work in C shift (2 PM to 11 PM).
• Client-facing skills: solid experience working in client-facing environments and building trusted relationships with client stakeholders.
• Good critical thinking and problem-solving abilities.
• Healthcare domain knowledge.
• Good communication skills.
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualifications
15 years of full-time education
Please note that at any given point in time, you can have only one "Active" application.