
Google Cloud Data Engineering Consultant

Job Location: Quebec - Montreal

Regional Description: Canada

Job Number: 00588013


Job Description

Responsibilities: 


• Work with the data team to use Hadoop/cloud infrastructure efficiently to analyse data, build models, and generate reports and visualizations

• Create and optimize distributed algorithms to scale out the recommendation engine

• Integrate massive datasets from multiple data sources for data modelling

• Automate all parts of the predictive pipeline to minimize labour in development and production

• Formulate business problems as technical data problems, ensuring key business drivers are captured in collaboration with product management

• Apply knowledge of machine learning algorithms, especially recommender systems

• Extract, load, transform, clean, and validate data

• Design pipelines and architectures for data processing

• Create and maintain machine learning and statistical models

• Query datasets, visualize query results, and create reports

Experience: 5-8 years


Skills: 

• 5+ years of experience with data engineering and data management systems for big data environments in the cloud (AWS/GCP).

• At least 2 years of experience using Google Cloud services such as Dataflow (Python), BigQuery, Dataproc, Cloud Storage, Cloud Spanner, Cloud Source Repositories, and App Engine, or Amazon Web Services such as EC2, EMR, Kinesis, RDS, and Redshift.

• 5+ years of experience writing complex SQL queries and performing data aggregation and transformation.

• Experience using BigQuery is a plus.

• API integration experience using REST APIs, JSON, Node.js, and Python is a plus.

• Experience with continuous integration and delivery environments such as Jenkins.

• Experience with Agile development methodology.

• Domain knowledge in digital marketing or sales is a plus.

• Knowledge of Apache Spark, writing MapReduce jobs, and writing R scripts is a plus.

• Google Cloud Platform certification is a plus.


Desired Experience: 

• BS or MS in Computer Science, Computer Engineering, or Electrical Engineering

• 5+ years of work experience specifically in Hadoop data engineering

• Experience architecting big data solutions, specifically on Hadoop

• Completed GCP data engineering course; certification is preferred

• Strong grasp of one or more programming languages such as Java, Scala, or Python

• Strong working experience with SQL, message queue concepts, real-time and batch pipeline workloads, and high availability

• Good understanding of algorithms and data structures

• Strong understanding of multi-threading and resource management

• Good understanding of DevOps and agile methodologies

• Experience with cloud data migration projects

• Experience writing optimized MapReduce code

• Experience with the Google Cloud technology stack: Pub/Sub, Dataflow, BigQuery, Bigtable, Apache Beam, etc.

• Experience working with large datasets in Hadoop (using scripts and tools such as NoSQL stores and Hive)

• Experience working with Hadoop and its ecosystem of products

• Experience working in cloud environments such as GCP, AWS, and Microsoft Azure

• Familiarity with machine learning algorithms, including regression, GBM, recommender systems, clustering, etc.

• Understanding of cloud and Hadoop ecosystem security

• Application containerization using Docker or Kubernetes

• Working experience with data modelling, ETL tools, data APIs, and relational databases such as MySQL

• Good understanding of cloud application and infrastructure architecture

• Strong analytical and problem-solving capabilities combined with the ambition to solve real-world problems

• Strong verbal and written communication skills
