Database Administrator
Pune
Job No. atci-4338265-s1686217
Full-time
Job Description
Project Role : Database Administrator
Project Role Description : Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must have skills : Data Modeling Techniques and Methodologies
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : Graduate
Data Modelling:
• Collaborate with cross-functional teams to understand business requirements and translate them into effective and scalable data models.
• Develop and maintain data models using industry-leading practices, with a strong emphasis on Data Mesh and Data Vault 2.0 methodologies.
• Ensure that data models align with the standards and guidelines defined by data architects and are adaptable to the evolving needs of the business.
• Develop conceptual, logical and physical data models and implement Data Mesh and Data Fabric on target platforms (Google BigQuery) using Erwin.
Domain Expertise:
• Acquire a deep understanding of the various business domains and their associated data, processes and systems, ensuring that data models reflect the domain-specific context and requirements.
Data Mesh Implementation:
• Work to Data Mesh architecture principles to ensure decentralised ownership and a domain-oriented approach to data.
• Define and implement data products, aligning with the Data Mesh principle of domain-driven, decentralised data ownership.
• Ensure that data is structured so that it easily conforms to the security controls and obligations that relate to it.
Data Vault 2.0 Implementation:
• Design and implement Data Vault 2.0-compliant data warehouses and hubs (an illustrative sketch follows this list).
• Ensure that the Data Vault model provides flexibility, scalability and resilience in handling complex and evolving business requirements.
• Ensure that every artefact built is optimised and monitored, and that cost is always considered.
• Support, guide and mentor team members in the domain.
Collaboration:
• Prior experience working in an agile squad environment with minimal supervision.
• Provide expert technical advice, presentations and education to technical and business audiences within Enterprise Data and Architectures and the wider business, including data stewards and enterprise architects, on enterprise conformance and Data Vault modelling concepts.
• Collaborate with solution architects, data engineers, data scientists and other stakeholders to understand data usage patterns, address production and data quality issues, and optimise data models for performance.
• Provide guidance and support to development teams in implementing data models within the Data Mesh and Data Vault 2.0 frameworks.
Documentation:
• Create and maintain comprehensive documentation of data models, ensuring it is accessible to relevant stakeholders.
• Keep abreast of industry trends, emerging technologies and best practices related to data modelling and integration.
• Create and maintain artefacts relating to data models (e.g. DDLs, data mappings, DMLs, data dictionaries, change registers, etc.).
Other skills beneficial for the role:
• Certification in Data Vault 2.0 or related technologies.
• Experience with tools such as Apache Kafka, Apache Flink or similar data streaming platforms.
• Familiarity with Google Cloud Platform or AWS platform services for data and AI/ML.
• Proficiency and experience with Erwin Data Modeler.
• Experience with or exposure to data catalogues such as Collibra and Ab Initio would be highly beneficial.
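Illustrative sketch (not part of the role description): a minimal example, under assumed names, of what a Data Vault 2.0 hub, link and satellite for a hypothetical "customer" domain might look like as BigQuery DDL, run through the google-cloud-bigquery Python client. The dataset, table and column names are invented purely for illustration.

"""Minimal Data Vault 2.0 layout sketch for BigQuery (hypothetical names only)."""

from google.cloud import bigquery

# Hypothetical dataset for a "customer" domain data product.
DATASET = "customer_domain_raw_vault"

DDL_STATEMENTS = [
    # Hub: one row per unique business key, plus standard Data Vault metadata columns.
    f"""
    CREATE TABLE IF NOT EXISTS `{DATASET}.hub_customer` (
      customer_hk   STRING    NOT NULL,  -- hash of the business key
      customer_bk   STRING    NOT NULL,  -- business key from the source system
      load_dts      TIMESTAMP NOT NULL,  -- load timestamp
      record_source STRING    NOT NULL   -- originating system
    )
    PARTITION BY DATE(load_dts)
    CLUSTER BY customer_hk
    """,
    # Link: records the relationship between two hubs, with no descriptive attributes.
    f"""
    CREATE TABLE IF NOT EXISTS `{DATASET}.link_customer_order` (
      customer_order_hk STRING    NOT NULL,
      customer_hk       STRING    NOT NULL,
      order_hk          STRING    NOT NULL,
      load_dts          TIMESTAMP NOT NULL,
      record_source     STRING    NOT NULL
    )
    PARTITION BY DATE(load_dts)
    CLUSTER BY customer_order_hk
    """,
    # Satellite: descriptive attributes and their change history, keyed to the hub.
    f"""
    CREATE TABLE IF NOT EXISTS `{DATASET}.sat_customer_details` (
      customer_hk   STRING    NOT NULL,
      load_dts      TIMESTAMP NOT NULL,
      hash_diff     STRING    NOT NULL,  -- hash of all attributes, used for change detection
      record_source STRING    NOT NULL,
      full_name     STRING,
      email         STRING
    )
    PARTITION BY DATE(load_dts)
    CLUSTER BY customer_hk
    """,
]

def create_raw_vault_tables(project_id: str) -> None:
    """Run the DDL against BigQuery; requires valid GCP credentials and project."""
    client = bigquery.Client(project=project_id)
    for ddl in DDL_STATEMENTS:
        client.query(ddl).result()  # wait for each DDL job to finish

if __name__ == "__main__":
    # Print the statements so the layout can be reviewed without a GCP project.
    for ddl in DDL_STATEMENTS:
        print(ddl.strip(), end="\n\n")

In this style, hubs carry only business keys, links carry only relationships, and satellites carry descriptive attributes and their history, which is what gives the model its flexibility for evolving business requirements.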
Qualifications
Graduate
Please be informed that at any given point in time, you can only have one "Active" application.