Job Location : Chennai, Pune, Noida, Kochi, Bangalore, Trivandrum
Experience : 5 Years
CTC Budget : INR 15,00,000 to 24,00,000
Posted At : 31-Oct-2025
Key Responsibilities
1. Develop and manage Azure Data Factory (ADF) pipelines to ingest data from legacy and cloud systems into the Data Lake.
2. Build and optimize Databricks notebooks for ingestion and curation, enabling semantic views for reporting and event streaming to messaging queues.
3. Optimize workflows for performance and cost efficiency, including job scheduling, cluster sizing, and PySpark/SQL code tuning.
4. Develop and manage Unity Catalog objects, ensuring proper access control and data organization.
5. Ensure compliance with PEP data governance and security standards across all data workflows.
6. Participate in code reviews, documentation, and DevOps deployments using Azure DevOps.
7. Integrate with systems such as Snowflake, Salesforce, Oracle, SAP, Azure SQL, PostgreSQL, messaging queues, ADLS, APIs, and Unity Catalog for data ingestion and publishing.
8. Contribute to solution design, job optimization, and data workflow architecture.
9. Support migration efforts, including the transition of Teradata ETL logic to Azure Databricks and Unity Catalog.
Technical Skills & Tools
1. Cloud & Data Engineering: Azure Data Factory, Azure Databricks, Azure Logic Apps, Azure Functions, AKS, Event Hubs, Kafka, Power Automate, etc.
2. Data Sources: Teradata, Sybase, SAP, Salesforce, Oracle, PostgreSQL, Snowflake, SharePoint
3. Languages & Frameworks: PySpark, SQL, Python
4. Data Governance: Unity Catalog, AAD integration, access control
5. DevOps & CI/CD: Azure DevOps, ARM templates
6. Visualization & Reporting: Exposure to Power BI and Web Apps
7. Streaming & Real-Time: Event Hubs, Kafka, Auto Loader, etc.