Data Engineer
Indore
Job No. atci-4617207-s1792230
Full-time
Job Description
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : GCP Dataflow
Good to have skills : Google Cloud DevOps Services, Google BigQuery, Google Cloud Platform Architecture
Minimum 7.5 years of experience is required
Educational Qualification : 15 years of full-time education
Summary: A Data Engineer is a developer responsible for working with the Data Warehouse team to design, build, and maintain a new data warehouse hosted in Google Cloud for our independence initiative. This work includes all aspects of developing ETL/ELT data pipelines for the data warehouse. You will work closely with other smart data engineers who enjoy being productive and making a difference in a friendly and pragmatic environment, in which each person knows that part of their job is to make others' jobs easier.
Roles & Responsibilities:
- Design and build ETL/ELT pipelines using Composer (Airflow), Java Dataflow (Beam), and other technologies on Google Cloud (illustrative sketches follow this section).
- Analyze source data and work with internal data consumers to determine which data is needed and how it should be represented in the output table schemas.
- Write and maintain related technical documentation (e.g., for data engineers, data analysts, security engineers).
- Participate in investigating technical options and conceiving best practices for the data warehouse and related ETL/ELT pipelines.
- Design and develop data solutions for data generation, collection, and processing.
- Create data pipelines to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data solutions.
- Implement ETL (extract, transform, and load) processes.
- Collaborate with cross-functional teams to gather requirements and understand data needs.
- Optimize data solutions for performance and scalability.
- Troubleshoot and resolve data-related issues.
- Stay up to date with the latest trends and technologies in data engineering.
Professional & Technical Skills:
- Must have: ETL/ELT for data warehousing, including data source investigation/analysis, target schema design, and data pipeline design/implementation.
- Must have: ETL/ELT into BigQuery; Java programming; Apache Beam, preferably on Google Cloud Dataflow and preferably in Java; Google Cloud Composer in Python.
- Google Cloud Cortex
- Apache Airflow in Python
- SQL scripting
Additional Information:
- The candidate should have a minimum of 10 years of experience in GCP Dataflow.
- This position is based at our Indore office.
- 15 years of full-time education is required.
- Solid interpersonal skills.
- Able to write clear, concise, and well-reasoned technical explanations and documentation (in English).
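For illustration only, not part of the formal role description: a minimal sketch of the kind of Beam pipeline this role builds, loading files from Cloud Storage into BigQuery. The posting prefers Java for Dataflow work; the Beam Python SDK is used here only so both sketches share one language. All project, bucket, table, and field names are hypothetical.

# Hypothetical Beam pipeline: reads newline-delimited JSON from Cloud
# Storage, parses each record, and appends it to a BigQuery table.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",        # execute on GCP Dataflow
        project="my-project",           # hypothetical project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadSource" >> beam.io.ReadFromText("gs://my-bucket/raw/orders-*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:warehouse.orders",   # hypothetical target table
                schema="order_id:STRING,amount:NUMERIC,created_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()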
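In the same hedged spirit, a sketch of a Composer (Airflow) DAG in Python that orchestrates a Dataflow load followed by a BigQuery SQL transform, matching the responsibilities above. The operators are real ones from the apache-airflow-providers-google package; the DAG id, template path, and stored procedure are placeholders invented for illustration.

# Hypothetical Composer (Airflow) DAG: launch a Dataflow template, then run
# a BigQuery SQL transform once the load succeeds.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)

with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one run per day (Airflow 2.4+ keyword)
    catchup=False,
) as dag:
    load = DataflowTemplatedJobStartOperator(
        task_id="load_orders",
        project_id="my-project",                     # hypothetical
        location="us-central1",
        template="gs://my-bucket/templates/orders",  # prebuilt Dataflow template
        parameters={"inputPath": "gs://my-bucket/raw/orders-*.json"},
    )

    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": "CALL warehouse.refresh_orders_daily();",  # hypothetical stored procedure
                "useLegacySql": False,
            }
        },
    )

    load >> transform  # the SQL transform runs only after the load succeeds

Keeping the heavy lifting in Dataflow and the orchestration in Composer, as this DAG does, is one common way to split the ETL/ELT work the role describes.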
Qualifications
15 years of full-time education
Please be informed that at any given point in time, you can only have one "Active" application.