Implement and optimize ELT processes, source-to-target mappings, and transformation logic in tools such as dbt, Azure Data Factory, Databricks Notebooks, and SnowSQL.
Collaborate with data scientists, analysts, data engineers, report developers, and infrastructure engineers to provide end-to-end support.
Co-develop CI/CD best practices, automation, and deployment pipelines with infrastructure engineers using GitHub Actions.
Automate source-to-target mappings, data pipelines, and data lineage in Collibra.
Required Experience:
Hands-on experience building data pipelines with Azure Data Factory (ADF), Snowflake, Databricks, and dbt.
Expertise in Azure Cloud, with integration experience involving Databricks, Snowflake, and ADLS Gen2.
Proficiency with data warehousing and lakehouse concepts, including ELT processes, Delta tables, and external tables for structured and unstructured data.
Experience with Databricks Unity Catalog and data-sharing technologies.
Strong skills with CI/CD tools (Azure DevOps, GitHub Actions) and version control using Git and GitHub.
Proven cross-functional collaboration and technical support experience for data scientists, report developers, and analysts.