Project Role : Data Architect
Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills : Data Architecture Principles
Good to have skills : NA
A minimum of 15 years of experience is required
Educational Qualification : No specific Certifications
As a Data & AI delivery lead, you will be responsible for managing and leading data and AI projects from inception to completion. You will work closely with cross-functional teams, including data scientists, data engineers, software developers, and business stakeholders, to ensure the successful delivery of data and AI solutions that drive business value. Your expertise in data and AI, combined with strong project management skills, will be essential in achieving project goals. You will also play a key role within the organization, responsible for designing, implementing, and managing the data architecture and infrastructure. You will provide solutions for migration and modernization programs across all layers, from ingestion to consumption, drawing on your experience in these areas.
Solutioning: You will provide solutions for clients' problems with their data strategy, bringing your experience and expertise in delivery and architecture principles that clients can implement.
Program Management: Lead end-to-end project management for data and AI initiatives, including project planning, resource allocation, risk management, and timeline tracking.
Technical Oversight: Ensure that data and AI solutions are built using best practices, and are scalable, reliable, and maintainable.
7-9 years of experience is required, covering:
1. Snowflake basics: tables, views, stored procedures (SPs), tasks, warehouses, and data ingestion
2. Good hands-on SQL knowledge: joins, NULL handling, aggregations, and CASE statements
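The SQL skills listed above can be illustrated with a minimal sketch. This example uses Python's built-in sqlite3 module and hypothetical `customers`/`orders` tables (not part of the role description) to show a join, NULL handling with COALESCE, an aggregation, and a CASE statement in one query:

```python
import sqlite3

# Hypothetical sample schema and data, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, NULL), (12, 2, 40.0);
""")

rows = conn.execute("""
SELECT c.name,
       SUM(COALESCE(o.amount, 0)) AS total,          -- NULL handling + aggregation
       CASE WHEN SUM(COALESCE(o.amount, 0)) >= 100
            THEN 'high' ELSE 'low' END AS tier       -- CASE statement
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id           -- join
GROUP BY c.name
ORDER BY c.name
""").fetchall()

for name, total, tier in rows:
    print(name, total, tier)
```

The same constructs carry over to Snowflake SQL, where warehouses, tasks, and stored procedures would surround queries like this one.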
Continuous Improvement: Identify areas for process improvement and implement best practices to enhance project delivery efficiency.
Stakeholder Management: You will work very closely with end clients, most often with senior stakeholders.
1. You have experience running large data programs and have successfully implemented multiple complex data migration programs to any of the cloud hyperscalers, such as AWS, Azure, or GCP.
2. You have a very good understanding of, and experience implementing, topics such as data mesh, data as a product, and data fabric, and you can demonstrate your experience in these areas.
3. You have experience doing core development work in any programming language; experience in PySpark or Scala is an additional advantage.
4. You have strong experience in data modelling, data management, data integration and data security.
5. You have experience in data architecture and database management, with a proven track record of designing and implementing data solutions.