Added: 12 days ago
Location:
Type: Full time
Salary: Not provided

Related skills

AWS, SQL, Python, dbt, Airflow

📋 Description

  • Understand clients' IT environments, apps, and goals.
  • Collect and manage large, varied data sets.
  • Collaborate with ML engineers to build data pipelines.
  • Define data models to integrate disparate data.
  • Design, implement, and maintain ETL/ELT pipelines.
  • Transform data using Spark, Trino, and AWS Athena.
  • Develop and deploy Data APIs with Python (Flask/FastAPI).
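To make the ETL/ELT responsibilities above concrete, here is a minimal sketch of a batch extract-transform-load step using only the Python standard library. The column names, sample data, and function names are illustrative assumptions, not taken from the posting; a production pipeline would read from real sources and write to a warehouse rather than an in-memory dict.

```python
import csv
import io

# Hypothetical sample input; columns and values are invented for illustration.
RAW = """user_id,amount,currency
1,10.50,USD
2,,USD
3,7.25,EUR
"""

def extract(text):
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast fields to proper types."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # skip incomplete records
        out.append({
            "user_id": int(r["user_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"],
        })
    return out

def load(rows):
    """'Load' by aggregating totals per currency (stands in for a warehouse write)."""
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0.0) + r["amount"]
    return totals

totals = load(transform(extract(RAW)))
print(totals)  # {'USD': 10.5, 'EUR': 7.25}
```

In an orchestrated setting, each of the three functions would typically become its own task (e.g. an Airflow or Dagster step), so failures can be retried per stage.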

🎯 Requirements

  • Real-time and batch data flows and warehousing (Airflow, Dagster, Kafka)
  • Experience with AWS
  • Python and SQL for data engineering
  • IaC: Terraform or CloudFormation
  • Experience building scalable APIs
  • Data Governance: quality, discovery, lineage, security
  • English: upper-intermediate or higher
  • Ownership mindset, proactive problem-solving

🎁 Benefits

  • Training with AWS certification support
  • Access to latest AI tools and subscriptions
  • Long-term B2B collaboration
  • 100% remote with flexible hours
  • International cross-functional team
  • Medical insurance or budget for care
  • Paid sick leave, vacation, holidays
  • Equipment and tech for productive work
  • Gifts for weddings/childbirth/personal milestones