Related skills: PostgreSQL, Python, dbt, Flask, Data Lakehouse

Description
- Design, build, and optimize ETL/ELT pipelines using dbt
- Develop and maintain scalable data infrastructure across PostgreSQL, Cosmos DB, and Azure services
- Manage the Data Lakehouse architecture, including Apache Iceberg
- Build Python-based APIs and microservices (Flask or similar)
- Design backend services for real-time and batch data access
- Work with distributed query engines (Trino) to analyze large datasets
Requirements
- 5+ years in data engineering or related roles
- Strong Python skills with production-grade APIs and microservices
- Deep expertise in SQL and PostgreSQL (schema design, performance)
- Hands-on experience with dbt for data transformation and pipeline development
- Experience with large-scale data systems and data lake environments
- Familiarity with Azure services (including Cosmos DB) and with Trino
Benefits
- Comprehensive medical, vision and dental benefits
- Flexible Spending Accounts (FSA)
- 401(k) with matching contributions
- Generous sick, vacation, and holiday benefits
- Gym membership contribution
- Internal referral program