Data Engineer
Bengaluru
Job No. atci-4633311-s1786428
Full-time
Job Description
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : AWS Architecture
Good to have skills : Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years of full-time education
Summary: Build data pipelines on AWS, with expected knowledge of S3 and Redshift, to replace existing pipelines built on GCP's BigQuery, Cloud Storage buckets, and Airflow.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Design and develop data solutions for data generation, collection, and processing.
- Create and maintain data pipelines to ensure efficient data flow.
- Implement ETL processes to migrate and deploy data across systems.
- Collaborate with cross-functional teams to understand data requirements and provide technical expertise.
- Ensure data quality and integrity by implementing data validation and cleansing techniques.
- Optimize data storage and retrieval for performance and scalability.
- Monitor and troubleshoot data pipelines and ETL processes to identify and resolve issues.
- Stay updated on industry trends and best practices in data engineering.
- Contribute to the documentation of data solutions and processes.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS Architecture.
- Good-to-Have Skills: Experience with Google BigQuery.
- Strong understanding of data engineering principles and best practices.
- Experience designing and implementing scalable data solutions.
- Proficiency in SQL and scripting languages for data manipulation and transformation.
- Familiarity with data warehousing concepts and technologies.
- Knowledge of cloud platforms and services, such as AWS or Azure.
- Experience with data integration and ETL tools, such as Apache Spark or Informatica.
Additional Information:
- The candidate should have a minimum of 3 years of experience in AWS Architecture.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualifications
15 years of full-time education
Please be informed that at any given point in time, you can only have one "Active" application.