Slalom Flex (Project Based) - Data Engineer
Slalom · 7 months ago
San Diego, California, United States
Hybrid
Contract
Junior Level (1-3 years)
Job Description
Position Overview
As a Data Engineer at Slalom, you will design and develop scalable data pipelines for ingestion, transformation, and integration of large data sets. Based in locations such as Los Angeles, Orange County, Phoenix, Austin, New Jersey, and Georgia, you’ll collaborate with innovative teams to build and optimize data architectures on cloud platforms like Azure, GCP, or AWS while ensuring high data quality, reliability, and integrity.
Key Responsibilities
- Design and develop scalable data pipelines for ingestion, transformation, and integration of large data sets
- Build and optimize data architectures, including lakehouse/data warehouse solutions on Azure, GCP, or AWS
- Implement and manage cloud-based ETL/ELT workflows using tools such as Azure Data Factory or GCP Dataflow
- Ensure data quality, reliability, and integrity through rigorous testing, validation, and monitoring
- Automate data workflows and infrastructure using scripting and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
- Monitor pipeline performance and troubleshoot issues as they arise
Required Qualifications
- BS degree with 3+ years of hands-on data engineering experience in a professional setting
- Proficiency with cloud platforms such as AWS, Azure, or GCP
- Strong skills in SQL and one or more programming languages (e.g., Python, Scala, or Java)
- Experience with cloud data warehouses like Snowflake, Databricks, BigQuery, Redshift, or Postgres
- Familiarity with dbt (data build tool)
Benefits & Perks
- Compensation: $55/HR to $85/HR (actual compensation depends on skills, experience, qualifications, location, and other factors)
- Meaningful time off and paid holidays
- 401(k) with a match
- Highly subsidized health, dental, and vision coverage
- Adoption and fertility assistance
- Short/long-term disability
- Yearly $350 reimbursement for well-being-related expenses
- Discounted home, auto, and pet insurance
Required Skills
SQL
dbt
ETL/ELT Workflows
Cloud Platforms (AWS, Azure, GCP)
Infrastructure-as-Code (Terraform, CloudFormation)
Python
Data Warehousing (Snowflake, Databricks, BigQuery, Redshift, Postgres)
Data Pipelines