Responsibilities

  • Design and develop ETL pipelines that move data from SQL Server into Azure and Microsoft Fabric environments.
  • Develop and manage incremental-load, full-load, and change data capture (CDC) pipelines using Azure Data Factory (ADF) or PySpark (an illustrative sketch follows this list).
  • Implement Medallion Architecture and design multi-layered data warehouses using star and snowflake schemas to support enterprise BI needs.
  • Build reusable, metadata-driven, and parameterized pipelines in ADF with dynamic datasets and linked services for flexible integration.
  • Integrate ADF with Azure DevOps for CI/CD, version control, and automated release management workflows.
  • Maintain and optimize CDC mechanisms to ensure data quality, consistency, and near real-time synchronization.
  • Implement alerting, monitoring, and performance optimizations in ADF and Fabric, and maintain detailed technical documentation.
  • Collaborate with stakeholders, mentor junior engineers, and lead technical reviews to align solutions with business and analytical needs.
  • Use Microsoft Fabric capabilities, including OneLake, Data Warehouse, and Data Pipelines, to enhance the data architecture and streamline workflows.
  • Certifications in Azure Data Engineering and Fabric Data Engineering are preferred.
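For context on the pipeline work described above, the following is a minimal, illustrative sketch (not this team's actual pipeline) of a watermark-based incremental load in PySpark with Delta Lake. The source table dbo.Orders, its modified_at watermark column, the order_id key, the Lakehouse path, and the connection details are all hypothetical placeholders.

```python
# Illustrative sketch only: watermark-based incremental load from SQL Server
# into a Delta table in a Lakehouse. All names and connection details are
# placeholders, not taken from this posting.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("incremental-load-sketch").getOrCreate()

target_path = "Tables/silver_orders"  # hypothetical Lakehouse Delta table

# 1. Read the current high-water mark from the target (empty target -> full load).
try:
    watermark = (spark.read.format("delta").load(target_path)
                 .agg(F.max("modified_at")).collect()[0][0])
except Exception:
    watermark = None

# 2. Pull only rows changed since the last run from SQL Server via JDBC.
query = "SELECT * FROM dbo.Orders" + (
    f" WHERE modified_at > '{watermark}'" if watermark else "")
incremental = (spark.read.format("jdbc")
               .option("url", "jdbc:sqlserver://<server>;databaseName=<db>")  # placeholder
               .option("query", query)
               .option("user", "<user>")
               .option("password", "<secret>")
               .load())

# 3. Merge (upsert) the changed rows into the target Delta table.
if DeltaTable.isDeltaTable(spark, target_path):
    (DeltaTable.forPath(spark, target_path).alias("t")
     .merge(incremental.alias("s"), "t.order_id = s.order_id")
     .whenMatchedUpdateAll()
     .whenNotMatchedInsertAll()
     .execute())
else:
    incremental.write.format("delta").save(target_path)
```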