Role: Data Engineer (Databricks + Azure & Python)
Location: Sydney
Full-time (Permanent)
Job Description:
- Design, build, and optimize scalable data pipelines using Azure Data Services and Databricks.
- Develop robust ETL/ELT workflows using PySpark and Azure Data Factory.
- Write efficient, reusable Python code for data transformation, ingestion, and orchestration.
- Manage and optimize Delta Lake tables for performance and reliability.
- Integrate Azure services (Data Lake, Synapse, Event Hubs, Key Vault) into end-to-end solutions.
- Implement and monitor data quality, lineage, and governance standards.
- Collaborate with data scientists and analysts to ensure data availability and usability.
- Build and maintain CI/CD pipelines for data workflows using Azure DevOps or GitHub Actions.
- Optimize Databricks clusters for cost and performance efficiency.
- Stay up to date with evolving Azure and Databricks features and recommend improvements.
Interested candidates can share their updated resumes at sourabh.sood@carecone.com.au or reach me on +61 290 559 949.