Type: Full time
Salary: Not provided

Related skills: SQL, Python, GraphQL, CI/CD, Airflow

📋 Description

  • Design data mesh architecture exposing domain data via GraphQL APIs.
  • Build and maintain an AWS data lake on S3, Glue, Lake Formation, and Redshift.
  • Develop and optimize ETL/ELT pipelines with AWS Glue and PySpark.
  • Implement AWS DMS pipelines to replicate data into Aurora PostgreSQL for real-time analytics.
  • Support data governance, quality, observability, and API design best practices.
  • Collaborate with product, engineering, and analytics teams to deliver robust data solutions.

🎯 Requirements

  • Bachelor’s degree in Computer Science, Mathematics, or a related field.
  • At least 4 years of data engineering experience with ownership of complex data systems.
  • Strong experience with AWS data lake technologies (S3, Glue, Lake Formation, Athena, Redshift).
  • Understanding of data mesh principles and decentralized data architecture.
  • Proficiency in Python and SQL.
  • Experience with data modeling, orchestration tools (Airflow), and CI/CD pipelines.