Related skills
postgresql, sql, python, dbt, trino
📋 Description
- Design and optimize ETL/ELT pipelines with dbt for the data warehouse.
- Build and maintain data transformation infrastructure with Azure Event Hub/Grid, PostgreSQL, and Cosmos DB.
- Manage data lakes with Apache Iceberg for scalable storage and processing.
- Use Trino for distributed SQL queries across diverse data sources.
- Develop Python scripts for data manipulation and automation.
- Containerize and orchestrate workloads with Docker and Kubernetes.
🎯 Requirements
- 5+ years of relevant data engineering experience.
- Bachelor’s degree in a related field.
- Expert-level SQL with PostgreSQL schema design and optimization.
- Deep proficiency in dbt for modular ETL/ELT pipelines.
- Advanced Python skills with pandas and polars.
- Experience with Azure Event Hub/Grid and Azure Cosmos DB.
🎁 Benefits
- Comprehensive medical, vision, and dental benefits.
- Flexible Spending Accounts (FSA) for healthcare.
- 401(k) with employer matching contributions.
- Generous sick, vacation, and holiday benefits.
- Gym membership contribution.
- Internal referral program.