Type: Full time
Related skills: AWS, ETL, GCP, dbt, Airflow
Description
- Design, build, and maintain scalable data pipelines and infrastructure
- Develop batch and streaming systems for large-scale data
- Own core data platform components (warehouse, lake, orchestration, tooling)
- Implement data modeling, transformation, and versioning frameworks
- Ensure data quality, observability, and reliability across pipelines
- Partner with data scientists and engineers to productionize ML workflows
Requirements
- 3–7+ years in data infrastructure or platform engineering
- Real-time and batch data processing (Spark, Kafka, etc.)
- Modern data tooling: dbt, Airflow, etc.
- Data modeling, ETL/ELT, reverse ETL, and orchestration
- Design for scalability, reliability, and observability
- Cloud platforms (AWS/GCP) and storage systems
Benefits
- Equity
- Health, dental, vision coverage
- Flexible PTO
- Opportunity to join early and shape the foundation