Related skills: aws, sql, python, data modeling, airflow
Description
- Design, build, and maintain scalable data pipelines for analytics-ready data
- Develop clean, structured data models standardizing business metrics
- Implement data quality checks, testing frameworks, and monitoring systems
- Build and maintain integrations with APIs, databases, and third-party tools
- Design and manage the data warehouse architecture for scalability and performance
- Optimize pipeline performance, cost efficiency, and data processing workflows
Requirements
- 3–6+ years of experience in Data Engineering or a similar role
- Strong Python and advanced SQL skills
- Experience building and maintaining production-grade data pipelines
- Solid understanding of data warehousing concepts and data modeling
- Experience with APIs, cloud (AWS preferred), and modern data stacks
- Nice to have: BigQuery, Snowflake, Redshift, dbt, Airflow
Benefits
- Remote-first role open to candidates anywhere in the U.S.
- Opportunity to work on high-impact initiatives
- Rapid learning and growth opportunities
- Benefits supporting your health, wellness, and professional goals