Added: 4 hours ago
Type: Full time
Salary: Not provided

Related skills: DynamoDB, AWS, Snowflake, SQL, Python

πŸ“‹ Description

  • Design, build, and maintain production data pipelines with Airflow and AWS (see the illustrative sketch after this list).
  • Own data ingestion from internal systems and third-party integrations.
  • Manage data storage and movement across S3, Snowflake, Snowpipe, and DynamoDB.
  • Write and maintain Python code that runs in production.
  • Work across dev, staging, and production with deployment and rollback.
  • Partner with analytics, data science, and product teams to design data models.
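
For a rough sense of the day-to-day work described above, the sketch below shows a minimal Airflow DAG that moves data from S3 into Snowflake on a daily schedule. It is an illustrative example only, not this employer's code: the DAG id, bucket, table, and load logic are hypothetical placeholders, and it assumes Airflow 2.x.

```python
# Minimal illustrative Airflow DAG: load files staged in S3 into Snowflake
# on a daily schedule. Names below are hypothetical; assumes Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_s3_to_snowflake(**context):
    # Placeholder body: a real task might trigger Snowpipe or run a
    # COPY INTO statement against Snowflake for the files staged in S3.
    print("Loading s3://example-bucket/raw/ into EXAMPLE_DB.RAW.EVENTS")


with DAG(
    dag_id="example_s3_to_snowflake",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # daily batch run (Airflow 2.x argument)
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_s3_to_snowflake",
        python_callable=load_s3_to_snowflake,
    )
```

In practice the load step would typically hand off to Snowpipe or a Snowflake COPY INTO statement rather than a print call, with dev, staging, and production variants deployed separately.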

🎯 Requirements

  • Strong experience building production data pipelines.
  • Deep Python proficiency for data engineering.
  • Hands-on Airflow experience in a production environment.
  • AWS experience including S3 and managed services.
  • SQL skills with Snowflake or similar warehouses.
  • Ability to mentor junior engineers and contribute to engineering best practices.

🎁 Benefits

  • Opportunity to work from home.
  • Medical, dental, and vision insurance.
  • Up to 15 days of paid time off.
  • 12 company-observed holidays.
  • 401k plan with company match.
  • Life insurance.