Related skills
aws, sql, python, dbt, airflow

📋 Description
- Understand clients' IT environments, apps, and goals.
- Collect and manage large, varied data sets.
- Collaborate with ML engineers to build data pipelines.
- Define data models to integrate disparate data.
- Design, implement, and maintain ETL/ELT pipelines.
- Transform data using Spark, Trino, and AWS Athena.
- Develop and deploy Data APIs with Python (Flask/FastAPI).
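To make the ETL/ELT responsibilities above concrete, here is a minimal sketch of an extract-transform-load step in plain Python, with in-memory SQLite standing in for a real warehouse (table and column names are hypothetical, not from the posting):

```python
import sqlite3

def run_etl(raw_rows):
    """Minimal ETL sketch: extract raw records, transform them
    (filter invalid rows, cast types), and load them into an
    in-memory SQLite table standing in for a warehouse."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
    # Transform: drop rows with a missing user_id, cast amount to float.
    cleaned = [
        (r["user_id"], float(r["amount"]))
        for r in raw_rows
        if r.get("user_id")
    ]
    conn.executemany("INSERT INTO events VALUES (?, ?)", cleaned)
    total, = conn.execute("SELECT SUM(amount) FROM events").fetchone()
    return total

# One malformed row is filtered out during the transform step.
total = run_etl([
    {"user_id": "u1", "amount": "10.5"},
    {"user_id": None, "amount": "3.0"},   # dropped: no user_id
    {"user_id": "u2", "amount": "4.5"},
])
```

In a production pipeline the same extract/transform/load shape would be orchestrated by Airflow or Dagster rather than called inline.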
🎯 Requirements
- Experience with real-time and batch data flows and warehousing (Airflow, Dagster, Kafka)
- Experience with AWS
- Python and SQL for data engineering
- IaC: Terraform or CloudFormation
- Experience building scalable APIs
- Data Governance: quality, discovery, lineage, security
- English: upper-intermediate or higher
- Ownership mindset, proactive problem-solving
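The data-governance requirement above (quality, lineage) can be illustrated with a minimal quality-check sketch; the rule names, fields, and threshold below are hypothetical examples, not part of the posting:

```python
def check_quality(rows, required_fields, max_null_ratio=0.1):
    """Minimal data-quality sketch: for each required field, verify
    that the share of null values stays under a threshold.
    Returns a dict of failed checks (empty means the batch passes)."""
    failures = {}
    n = len(rows) or 1  # avoid division by zero on empty batches
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        if nulls / n > max_null_ratio:
            failures[field] = f"{nulls}/{len(rows)} null values"
    return failures

batch = [{"id": 1, "email": "a@x.io"}, {"id": 2, "email": None}]
# Half the emails are null, which exceeds the 25% threshold.
failures = check_quality(batch, ["id", "email"], max_null_ratio=0.25)
```

Checks like this typically run as a pipeline step, failing the batch (and alerting) before bad data propagates downstream.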
🎁 Benefits
- Training budget with AWS certification support
- Access to latest AI tools and subscriptions
- Long-term B2B collaboration
- 100% remote with flexible hours
- International cross-functional team
- Medical insurance or a healthcare budget
- Paid sick leave, vacation, holidays
- Equipment and tech for productive work
- Gifts for weddings, childbirth, and other personal milestones