Added
7 days ago
Type
Full time
Related skills
sql, python, dbt, airflow, kafka
Description
- Own data architecture end-to-end for critical business data.
- Build and operate streaming and batch pipelines across domains.
- Design domain-oriented canonical data models for analytics and ML.
- Enforce data quality with tests, lineage, monitoring, and reconciliation.
- Automate operational workflows across services and warehouses.
- Enable insights via semantic layers, APIs, and real-time queries.
Requirements
- 3+ years as a data or software engineer building data warehouses or distributed systems.
- Experience designing and implementing data models using dimensional, Data Vault, or ledger-style techniques.
- Hands-on with Kafka and Debezium; dbt, Spark, and Flink; Dagster and Airflow.
- Experience operating cloud data warehouses (Snowflake, BigQuery, Redshift), including schema design and cost optimization.
- Proficient in Python and SQL; familiar with CI/CD and infrastructure-as-code.
- Ability to collaborate across engineering, product, and analytics to translate business needs into data systems.
Benefits
- Generous holiday and time-off policy
- Health insurance options including medical, dental, and vision
- Work From Home Support
- Home office setup allowance
- Monthly allowance for cell phone and internet
- Parental Leave and family planning benefits