Application Developer
Bengaluru
Job No. atci-4588178-s1774768
Full-time
Job Description
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : PySpark
Good to have skills : Snowflake Data Warehouse, Denodo Data Virtualization Platform
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years of full-time education
Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to understand business needs, designing and developing applications using PySpark, and ensuring the applications meet the required standards and functionality.

Roles & Responsibilities:
- Perform independently and develop into a subject matter expert (SME).
- Actively participate in and contribute to team discussions.
- Contribute to solutions for work-related problems.
- Design and develop applications using PySpark.
- Collaborate with the team to understand business needs and requirements.
- Ensure applications meet the required standards and functionality.
- Troubleshoot and debug applications to resolve issues.
- Optimize application performance and scalability.
- Stay updated with the latest industry trends and technologies.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Good-to-Have Skills: Experience with Snowflake Data Warehouse and the Denodo Data Virtualization Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualifications
15 years of full-time education
Please be informed that at any given point in time, you can only have one "Active" application.