
Job Listing

Data Platform - Hadoop Developer

Job Location: Kuala Lumpur

Regional Description: Malaysia

Job Number: 00550375


Job Description

The digital revolution is changing everything. It’s everywhere – transforming how we work and play. Are you reacting to the disruption each day or are you leading the way as a digital disrupter? Accenture Digital is driving these exciting changes and bringing them to life across 40 industries in more than 120 countries. At the forefront of digital, you’ll create it, own it and make it a reality for clients looking to better serve their connected customers and operate always-on enterprises. Join us and become an integral part of our experienced digital team with the credibility, expertise and insight clients depend on.
Analytics, part of Accenture Digital, helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytics capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
As part of our Analytics practice, you will join a worldwide network of over 13,000 smart and driven colleagues experienced in leading statistical tools, methods and applications. From data to analytics and from insights to actions, our forward-thinking managers provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. The Accenture Digital Analytics team is currently looking for a:
Data Platform - Hadoop Developer
Responsibilities:
  • Design and implement data management and/or architecture solutions for Hadoop
  • Design and develop data integration and processing components on Hadoop (see the illustrative sketch after this list):
    1. Extract and load processes, e.g. Sqoop, Flume
    2. Data processing and transformation on Hadoop - Spark, MapReduce
    3. Streaming / real time data load to Hadoop - e.g. Kafka, Flume
  • Support security and metadata implementation on the Hadoop solution
  • Advise on detailed technical design for Hadoop components (e.g. HDFS file compression and directory structure)
  • Design and run unit tests.
  • Perform bug diagnosis and fix.
  • Migrate code between development and test environments.
  • Participate in support of the Hadoop development environment.
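
To illustrate the kind of data integration and processing work described above, here is a minimal Spark sketch in Scala that reads raw files landed on HDFS (for example by Sqoop or Flume), applies a simple transformation, and writes the result back as partitioned Parquet. The paths, column names and formats are hypothetical and only indicate the shape of the work, not a prescribed implementation.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._

  object RawToCurated {
    def main(args: Array[String]): Unit = {
      // Hypothetical example: raw CSV landed on HDFS is cleaned and rewritten
      // as partitioned Parquet for downstream querying (e.g. via Hive).
      val spark = SparkSession.builder()
        .appName("raw-to-curated")
        .getOrCreate()

      val raw = spark.read
        .option("header", "true")
        .csv("hdfs:///data/raw/transactions")      // hypothetical landing directory

      val curated = raw
        .withColumn("amount", col("amount").cast("double"))
        .withColumn("event_date", to_date(col("event_ts")))
        .filter(col("amount").isNotNull)

      curated.write
        .mode("overwrite")
        .partitionBy("event_date")                 // partitioned directory structure on HDFS
        .parquet("hdfs:///data/curated/transactions")

      spark.stop()
    }
  }
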
Qualifications:
  • 3+ years of experience designing and implementing large-scale data loading, manipulation and processing solutions using Hadoop/NoSQL technologies
  • Strong knowledge and proficiency in core Hadoop technologies (e.g. HDFS, HBase, Hive, MapReduce)
  • Strong proficiency in Spark development, with experience on at least three projects
  • Proficiency in Hadoop integration packages, e.g. Sqoop, Flume
  • Proficiency in NoSQL databases, e.g. Cassandra, MongoDB
  • Proficiency in streaming integration development, e.g. Kafka
  • Cloud development experience (e.g. AWS, Azure)
  • Undergraduate degree at minimum in Computer Science or Information Systems
  • Downstream oil and gas experience, or related experience in chemicals or industrial projects, is beneficial
Professional Skills:
  • Eagerness to contribute in a team-oriented environment
  • Ability to work creatively and analytically in a problem-solving environment
  • Desire to work in an information systems environment
  • Excellent leadership, communication (written and oral) and interpersonal skills
