Databricks Data Engineer - Senior Consultant
Deloitte
Los Angeles, CA, United States
Hybrid
Full-time
Senior Level (5+ years)
Job Description
Position Overview
As a Senior Data Engineer, you will oversee the end-to-end design, deployment, and optimization of enterprise-scale data engineering solutions using Databricks on AWS, Azure, or GCP. This highly strategic role focuses on leading innovation in big data architecture and analytics, shaping best practices, advising senior stakeholders, and ensuring that data solutions align with business objectives and drive measurable results. Recruiting for this role ends on 2/13/2026.
Compensation: $136,700 - $188,900
Key Responsibilities
- Architect and Deliver Solutions: Lead the development, implementation, and scaling of advanced data engineering solutions using Databricks across AWS, Azure, or GCP environments.
- Champion Best Practices: Establish, document, and promote best-in-class approaches for data architecture, integration, and modeling.
- Pipeline Ownership: Oversee the design, development, and maintenance of robust data pipelines and architectures that support large-scale, enterprise data needs.
- Drive Excellence: Initiate and manage efforts to improve data quality, operational efficiency, and process scalability.
- Technology Leadership: Evaluate, pilot, and integrate new big data and analytics technologies to ensure the organization remains at the cutting edge.
- Strategic Data Governance: Consult on, design, and implement governance, security, and compliance strategies tailored to modern cloud data ecosystems.
- Team Leadership and Mentoring: Lead, coach, and develop teams of data engineers and architects, fostering technical growth and effective project delivery.
- Stakeholder Engagement: Communicate technical concepts and business value to diverse stakeholders, including executives and business leads.
- DevOps and Automation: Oversee the implementation of CI/CD practices with tools such as Azure DevOps, AWS CodePipeline, Jenkins, TFS, or PowerShell for streamlined deployments.
Required Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field (Master's preferred).
- Experience Required: 5+ years of hands-on experience in data engineering with a strong focus on Databricks deployed on major cloud platforms (AWS, Azure, or GCP).
- Deep understanding of Lakehouse architecture, Apache Spark, Delta Lake, and related big data technologies.
- Advanced skills in data warehousing, 3NF, dimensional modeling, and enterprise-level data lakes.
- Experience with Databricks components including Delta Live Tables, Auto Loader, Structured Streaming, Databricks Workflows, and orchestration tools (e.g., Apache Airflow).
- Expertise in designing and supporting incremental data loads, and in building metadata-driven ingestion/data quality frameworks using PySpark.
- Hands-on experience with Databricks Unity Catalog and implementing fine-grained security and access control.
- Proven track record in deploying code and solutions via automated CI/CD pipelines.
- Demonstrated leadership managing complex, cross-functional data projects and technical teams (minimum 1 year).
- Experience with performance optimization of data engineering pipelines, code, and compute resources.
- Ability to travel up to 50% based on business needs.
- Limited immigration sponsorship may be available.
Preferred Qualifications
- Comprehensive knowledge of the AWS, Azure, and GCP cloud ecosystems and associated big data stacks.
- Demonstrated skill in performance tuning and optimization within Databricks/Apache Spark environments.
- Up-to-date knowledge of the latest Databricks feature releases and platform enhancements.
- Exceptional communication and stakeholder management abilities, including interfacing with executive leadership.
- Experience with Databricks Lakeflow is a plus.
- Experience in AI/ML is a plus.
Benefits & Perks
- Benefits: Eligibility to participate in a discretionary annual incentive program; accommodations are available for applicants with disabilities.
Required Skills
Data Architecture
CI/CD
Data Engineering
Cloud Platforms (AWS, Azure, GCP)
Databricks
Python
Delta Lake
Apache Spark