Related skills
bigquery, terraform, snowflake, sql, python

Description
- Architect scalable, fault-tolerant ELT/ETL pipelines.
- Lead data team standards, code reviews, and practices.
- Design optimized data models (Data Vault, Kimball) in a cloud data warehouse.
- Implement and manage data infrastructure with Terraform or CloudFormation.
- Establish data quality, monitoring, testing (dbt, Great Expectations).
- Collaborate with Product, Data Science, and Eng to power customer features.
Requirements
- 5+ years of professional data engineering experience.
- Strong SQL and experience with cloud data warehouses (Snowflake, Databricks, BigQuery), including performance tuning.
- Deep Python expertise for building and optimizing data processing applications.
- Experience with Airflow/Prefect/Dagster and dbt.
- Streaming technologies such as Kafka, Kinesis, or Spark Streaming.
- DataOps, CI/CD, and IaC practices.
Benefits
- Hybrid work model with flexible arrangements.
- Health and welfare benefits.
- Generous paid time off, including paid leave.
- Opportunities for career growth and development.