We are seeking a highly skilled Databricks Architect & Engineer to design, build, and optimize our enterprise data and analytics platform on Databricks. This role combines deep technical engineering expertise with strategic architecture leadership to enable scalable, high-performance data solutions across the organization.
You will lead initiatives around data lakehouse architecture, Delta Lake pipelines, and Databricks platform governance, working closely with data engineers, data scientists, and cloud architects to deliver reliable, efficient, and secure data ecosystems.
Define and implement end-to-end Databricks architecture across environments (development, staging, production).
Design data lakehouse solutions leveraging Delta Lake, Unity Catalog, and Databricks SQL.
Develop standards, frameworks, and best practices for data ingestion, transformation, orchestration, and governance.
Collaborate with cloud architects to ensure seamless integration with Azure, AWS, or GCP infrastructure.
Architect cost-efficient cluster strategies, including autoscaling, job clusters, and performance tuning.
Build, optimize, and deploy data pipelines using PySpark, SQL, and Databricks workflows.
Implement CI/CD pipelines for Databricks using tools such as GitHub Actions, Azure DevOps, or Jenkins.
Develop and maintain data quality checks, monitoring, and alerting mechanisms.
Partner with data scientists to operationalize ML models using MLflow and Model Serving.
Manage platform automation and infrastructure-as-code using Terraform or similar tools.
Implement data security, lineage, and access control using Unity Catalog and compliance best practices.
Create and maintain documentation, playbooks, and reusable templates for the data engineering team.
Mentor engineers and analysts on Databricks features, coding standards, and performance optimization.
Collaborate with Databricks customer success and support teams to stay current with new releases and platform capabilities.