
Job Description

Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.

Qualifications


  • Minimum 1 year of experience is required
  • Develop high-quality, scalable ETL/ELT pipelines using Databricks technologies including Delta Lake, Auto Loader, and DLT.
  • Excellent programming and debugging skills in Python.
  • Strong hands-on experience with PySpark to build efficient data transformation and validation logic.
  • Must be proficient in at least one cloud platform: AWS, GCP, or Azure.
  • Create modular dbx functions for transformation, PII masking, and validation logic — reusable across DLT and notebook pipelines.
  • Implement ingestion patterns using Auto Loader with checkpointing and schema evolution for structured and semi-structured data.
  • Build secure and observable DLT pipelines with DLT Expectations, supporting Bronze/Silver/Gold medallion layering.
  • Configure Unity Catalog: set up catalogs, schemas, user/group access, enable audit logging, and define masking for PII fields.
  • Enable secure data access across domains and workspaces via Unity Catalog External Locations, Volumes, and lineage tracking.
  • Access and utilize data assets from the Databricks Marketplace to support enrichment, model training, or benchmarking.
  • Collaborate with data sharing stakeholders to implement Delta Sharing — both internally and externally.
  • Integrate Power BI/Tableau/Looker with Databricks using optimized connectors (ODBC/JDBC) and Unity Catalog security controls.
  • Build stakeholder-facing SQL Dashboards within Databricks to monitor KPIs, data pipeline health, and operational SLAs.
  • Prepare GenAI-compatible datasets: manage vector embeddings, index with Databricks Vector Search, and use Feature Store with MLflow.
  • Package and deploy pipelines using Databricks Asset Bundles through CI/CD pipelines in GitHub or GitLab.
  • Troubleshoot, tune, and optimize jobs using Photon engine and serverless compute, ensuring cost efficiency and SLA reliability.
  • Experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
  • Hands-on experience applying performance optimization techniques.
  • A solid understanding of data modeling and data warehousing principles is essential.
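The reusable PII-masking and validation logic mentioned above can be sketched in plain Python. This is a minimal, hypothetical example (the function and field names are illustrative, not part of any actual codebase); in a Databricks pipeline such a helper would typically be registered as a PySpark UDF or called from DLT and notebook code.

```python
import re

# Matches a simple e-mail pattern; real pipelines may need a stricter rule.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")


def mask_email(value: str) -> str:
    """Replace the local part of any e-mail address with asterisks."""
    def _mask(match: re.Match) -> str:
        local, _, domain = match.group(0).partition("@")
        return "*" * len(local) + "@" + domain
    return EMAIL_RE.sub(_mask, value)


def mask_record(record: dict, pii_fields: set) -> dict:
    """Return a copy of the record with the named PII string fields masked."""
    return {
        key: mask_email(val) if key in pii_fields and isinstance(val, str) else val
        for key, val in record.items()
    }
```

A helper like this keeps masking logic in one testable place, so the same function can back a Delta Live Tables transformation, a notebook cell, or a Unity Catalog masking policy wrapper.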

Good to Have:

  • Certifications: Databricks Certified Professional or similar.
  • Machine learning: knowledge of machine learning concepts and experience with popular ML libraries.
  • Knowledge of big data processing frameworks (e.g., Spark, Hadoop, Hive, Kafka).
  • Data orchestration: Apache Airflow.
  • Knowledge of CI/CD pipelines and DevOps practices in a cloud environment.
  • Experience with ETL tools such as Informatica, Talend, Matillion, or Fivetran.
  • Familiarity with dbt (Data Build Tool).


#LI-PH

Life at Accenture

Office and work environment

Whether you work virtually or on site, our multifunctional spaces support innovation, creativity, and collaboration.

Training and development

Take time to learn in our regional centers, connected classrooms, online courses, and learning boards.

Learn more about Accenture

Our expertise

See how we harness the power of change to create value and shared success for our clients, our people, our shareholders, our partners, and our communities.

Meet our people

From entry level to leadership, across every business and industry sector, meet the people who harness technology to make a difference, every day.

Stay connected

Join our team

Search for roles that suit you. We are looking for passionate, curious, creative, solution-oriented team players.

Stay informed

Career tips, insider perspectives, and timely industry analysis from the people who work here.