Data Engineer
Pune
Job No. atci-4662559-s1805257
Full-time
Job Description
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Google Cloud Data Services, GCP Dataflow, Apache Airflow, Python (Programming Language)
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary:
As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will be responsible for designing and implementing data solutions that meet the needs of the organization and contribute to its overall success.
Roles & Responsibilities:
- Act as a subject matter expert (SME); collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and develop data pipelines to extract, transform, and load data.
- Ensure data quality and integrity throughout the data processing lifecycle.
- Implement ETL processes to migrate and deploy data across systems.
- Collaborate with cross-functional teams to understand data requirements and design appropriate solutions.
Professional & Technical Skills:
- Must have: proficiency in Google Cloud Data Services, GCP Dataflow, Apache Airflow, and Python (programming language).
- Must have: GCP cloud data implementation projects (DataProc, Cloud Composer, BigQuery, Cloud Storage, GKE, etc.).
- Nice to have: Scala experience.
- Strong experience with one of the leading public clouds.
- Strong experience in designing and building scalable data pipelines covering extraction, transformation, and loading.
- Strong experience in data engineering with an emphasis on data warehousing and data analytics.
- Mandatory: years of experience with Python, with working knowledge of Notebooks.
- Mandatory: years working on cloud data projects.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Cloud Data Services.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Qualifications
15 years full time education
Please note that at any given point in time, you can have only one "Active" application.