Job Summary: We are seeking a highly skilled Data Engineer with a minimum of 5 years of hands-on experience in Informatica Intelligent Data Management Cloud (IDMC, formerly IICS). The successful candidate will design, implement, and maintain scalable data integration solutions using Informatica Cloud services. Experience with CI/CD pipelines is required to ensure efficient development and deployment cycles. Familiarity with Informatica Data Catalog, Data Governance, and Data Quality is a strong advantage.
Key Responsibilities:
Design, build, and optimize end-to-end data pipelines in Informatica IDMC, including Cloud Data Integration and Cloud Application Integration.
Implement ETL/ELT processes to support data lakehouse and enterprise data warehouse (EDW) use cases.
Develop and maintain CI/CD pipelines to support automated deployment and version control.
Work closely with data architects, analysts, and business stakeholders to translate data requirements into scalable solutions.
Monitor job performance, troubleshoot issues, and ensure compliance with SLAs and data quality standards.
Document technical designs, workflows, and integration processes following best practices.
Collaborate with DevOps and cloud engineering teams to ensure seamless integration within the cloud ecosystem.
Required Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
Minimum 5 years of hands-on experience with Informatica IDMC.
Experience building and deploying CI/CD pipelines using tools such as Git and Azure DevOps.
Proficient in SQL, data modeling, and transformation logic.
Experience with cloud platforms (Azure or OCI).
Strong problem-solving skills in data operations and pipeline performance.
Preferred / Nice-to-Have Skills:
Experience with Informatica Data Catalog for metadata and lineage tracking.
Familiarity with Informatica Data Governance tools such as Axon and Business Glossary.
Hands-on experience with Informatica Data Quality (IDQ) for data profiling and cleansing.
Experience developing data pipelines using Azure Data Factory.