
Job Listing



Big Data Engineering Architect Consultant

Job Location: DC - Washington, FL - Orlando, GA - Atlanta, IL - Chicago, MA - Boston, MD - Baltimore, MN - Minneapolis, NC - Charlotte, NY - New York, PA - Philadelphia, TX - Dallas, TX - Houston, VA - Richmond, WA - Seattle

Regional Description: Midwest

Job Number: 00529120


Job description

Schedule: Full-time
Organization: Analytics Business
Travel: 100% (Monday - Friday)
Position: Analytics Consulting – Big Data Engineering Architect Consultant

The digital revolution is changing everything. It’s everywhere – transforming how we work and play. Are you reacting to the disruption each day or are you leading the way as a digital disrupter? Accenture Digital is driving these exciting changes and bringing them to life across 40 industries in more than 120 countries. At the forefront of digital, you’ll create it, own it and make it a reality for clients looking to better serve their connected customers and operate always-on enterprises. Join us and become an integral part of our experienced digital team with the credibility, expertise and insight clients depend on.
Accenture Digital is powered by three practices: Mobility, Interactive, and Analytics. As part of our Analytics practice, you’ll deliver analytically informed, issue-based solutions that help clients make faster, smarter decisions. You’ll play a critical role in helping them tackle complex business issues.


JOB DESCRIPTION

Do you have a pulse on new technologies and a desire to change the way business gets done? Do you want to implement emerging solutions for some of the most successful companies around? If you answered yes to these questions and you are passionate about helping clients effectively manage enormous amounts of data to generate knowledge and value, then we want to meet you.

 

Data Engineering Architects at the Senior Manager level will be responsible for solving digital-age data challenges across multiple industries through the architecture, design, and implementation of an ecosystem of Hadoop, NoSQL, self-service data preparation, AI, machine learning, and other modern data technologies, on premises and in the cloud. These solutions typically involve re-engineering data acquisition, storage, processing, security, data management, governance, and analysis with these technologies, leading to the implementation of a modern data platform. Solid experience with, and an understanding of, the planning, scaling, deployment, and operational considerations unique to Hadoop, NoSQL, and other emerging modern data technologies is required. We are looking for candidates who have a broad set of skills across these areas and who can demonstrate an ability to identify and apply modern data and cloud technologies to architect and implement innovative, differentiated data solutions. Our architects are expected to work with clients to define the future-state architectures of their data platforms, enabling analytical intelligence, and then to play a lead role in implementing a roadmap to reach that state.
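
For illustration only (not part of the role description): the following is a minimal PySpark sketch of the kind of pipeline described above, reading raw data landed in cloud object storage, applying a light transformation, and writing a partitioned dataset into the curated zone of a data lake. The bucket names, paths, and column names are hypothetical placeholders, and the sketch assumes a Spark environment configured with Hive support and S3 access.

    # Illustrative only: a minimal batch pipeline on a modern data platform.
    # All bucket names, paths, and columns below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("modern-data-platform-sketch")
        .enableHiveSupport()   # assumes a Hive metastore is available
        .getOrCreate()
    )

    # Acquire: read raw JSON events landed in cloud object storage (e.g., S3).
    raw = spark.read.json("s3a://example-landing-zone/events/")

    # Process: light cleansing and enrichment.
    curated = (
        raw.filter(F.col("event_type").isNotNull())
           .withColumn("event_date", F.to_date("event_timestamp"))
    )

    # Store: write a partitioned, columnar dataset into the curated zone.
    (
        curated.write
               .mode("overwrite")
               .partitionBy("event_date")
               .parquet("s3a://example-curated-zone/events/")
    )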

Basic Qualifications

  • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience
  • Minimum 1 year of architecting, implementing, and successfully operationalizing large-scale data solutions in production environments using the Hadoop and NoSQL ecosystem, on premises or in the cloud (AWS, Google, or Azure), with many of the relevant technologies such as NiFi, Spark, Kafka, HBase, Hive, Cassandra, EMR, Kinesis, BigQuery, DataProc, and Azure Data Lake
  • Minimum 1 year of architecting data and building performant data models at scale for a Hadoop/NoSQL ecosystem of data stores to support different business consumption patterns from a centralized data platform (a brief illustrative sketch follows this list)
  • Minimum 1 year of Spark/MapReduce/ETL processing, using Java, Python, Scala, or Talend, for data analysis of production Big Data applications
  • Minimum 1 year of architecting and industrializing data lakes or real-time platforms for an enterprise, enabling business applications and usage at scale
  • Minimum 2 years of designing and implementing relational data models, working with RDBMSs, and an understanding of the challenges in these environments
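
A minimal, hypothetical illustration of the data-modeling qualification above: defining a partitioned analytical table with Spark SQL so that common, date-bounded consumption patterns can prune partitions. The database, table, and column names are placeholders, not part of this role's requirements.

    # Illustrative only: a hypothetical partitioned table definition in Spark SQL.
    # Database, table, and column names are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("data-model-sketch")
        .enableHiveSupport()   # assumes a Hive metastore is available
        .getOrCreate()
    )

    spark.sql("""
        CREATE TABLE IF NOT EXISTS analytics.customer_events (
            customer_id   STRING,
            event_type    STRING,
            event_payload STRING,
            event_date    DATE
        )
        USING parquet
        PARTITIONED BY (event_date)  -- date partitions support date-bounded queries
    """)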

Preferred Skills

  • Minimum 1 year of experience implementing SQL-on-Hadoop solutions using tools such as Presto, AtScale, and others
  • Minimum 1 year of experience implementing large-scale BI/visualization solutions on Big Data platforms
  • Minimum 1 year of experience implementing large-scale, secure cloud data solutions using AWS data and analytics services, e.g. S3, EMR, Redshift
  • Minimum 1 year of experience implementing large-scale, secure cloud data solutions using Google data and analytics services, e.g. BigQuery, DataProc
  • Minimum 1 year of experience building data management (metadata, lineage, tracking, etc.) and governance solutions for modern data platforms that use Hadoop and NoSQL, on premises or on the AWS, Google, or Azure cloud
  • Minimum 1 year of experience securing Hadoop/NoSQL-based modern data platforms, on premises or on the AWS, Google, or Azure cloud
  • Minimum 1 year of re-architecting and rationalizing traditional data warehouses with Hadoop or NoSQL technologies, on premises or in transition to the AWS or Google cloud
  • Experience implementing data wrangling and data blending solutions for enabling self-service, using tools such as Trifacta and Paxata
  • 1 year of industry systems development and implementation experience, OR minimum 2 years of experience in data loading, acquisition, storage, transformation, and analysis
  • Minimum 1 year of using ETL tools such as Talend or Informatica within a Big Data environment to perform large-scale, metadata-integrated data transformation
  • Minimum 1 year of building business catalogs or data marketplaces on top of a hybrid data platform containing Big Data technologies
Responsibilities include the following:

  • Architect modern data solutions in a hybrid environment of traditional and modern data technologies such as Hadoop and NoSQL

  • Create technical and operational architectures for these solutions incorporating Hadoop, NoSQL and other modern data technologies
  • Implement and deploy custom solutions/applications using Hadoop/NoSQL
  • Lead and guide implementation teams and provide technical subject matter expertise in support of the following:
    • Designing, implementing, and deploying ETL to load data into Hadoop/NoSQL (see the illustrative sketch after this list)
    • Security implementation of Hadoop/NoSQL solutions
    • Managing data in Hadoop/NoSQL co-existing with traditional data technologies in a hybrid environment
    • Troubleshooting production issues with Hadoop/NoSQL
    • Performance tuning of a Hadoop/NoSQL environment
  • Architecting and implementing metadata management solutions around Hadoop and NoSQL in a hybrid environment
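
For the ETL responsibility above, the following is a minimal, hypothetical PySpark sketch of loading staged data into a NoSQL store. It assumes the open-source spark-cassandra-connector is on the classpath; the host, keyspace, table, paths, and column names are placeholders.

    # Illustrative only: a minimal ETL load into a NoSQL (Cassandra) table.
    # Assumes the spark-cassandra-connector package is available; all names
    # below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("etl-load-sketch")
        .config("spark.cassandra.connection.host", "cassandra.example.internal")
        .getOrCreate()
    )

    # Extract: read source records already staged on HDFS.
    orders = spark.read.parquet("hdfs:///data/staging/orders/")

    # Transform: deduplicate and stamp the load time.
    latest = (
        orders.dropDuplicates(["order_id"])
              .withColumn("loaded_at", F.current_timestamp())
    )

    # Load: append into a Cassandra table for low-latency serving.
    (
        latest.write
              .format("org.apache.spark.sql.cassandra")
              .mode("append")
              .options(table="orders_by_id", keyspace="sales")
              .save()
    )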
