Added: 16 days ago
Type: Full time
Salary: Not provided

Related skills

DynamoDB, AWS, Snowflake, S3, Python

πŸ“‹ Description

  • Design, build, and maintain production-grade data pipelines using Airflow and AWS.
  • Own data ingestion from internal systems and third-party integrations.
  • Manage data storage and movement across S3, Snowflake, Snowpipe, and DynamoDB.
  • Write and maintain custom Python code that runs in production.
  • Work across dev, staging, and production with deployment and rollback practices.
  • Partner with analytics, data science, and product teams to design reliable data models.
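The pipeline responsibilities above (ingest, transform, land in S3/Snowflake, with production-grade retry behavior) can be sketched as a toy extract-transform-load run. This is a stdlib-only illustration, not Airflow itself: the task names, sample records, and the `with_retries` helper are all hypothetical stand-ins for what an Airflow operator and an S3/Snowpipe stage would do in the real stack.

```python
import json
import time
from typing import Callable

def with_retries(task: Callable[[], object], attempts: int = 3, delay: float = 0.0) -> object:
    """Run a task with a fixed retry budget, roughly as an Airflow operator would."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(delay)

def extract() -> list[dict]:
    # Stand-in for pulling raw records from an internal system or third-party API.
    return [{"user_id": 1, "event": "signup"}, {"user_id": 2, "event": "login"}]

def transform(records: list[dict]) -> list[str]:
    # Normalize records into newline-delimited JSON, a common shape for
    # files that Snowpipe auto-ingests from an S3 stage.
    return [json.dumps(r, sort_keys=True) for r in records]

def load(lines: list[str]) -> int:
    # Stand-in for writing the batch to S3; returns the row count loaded.
    return len(lines)

def run_pipeline() -> int:
    records = with_retries(extract)
    lines = transform(records)
    return with_retries(lambda: load(lines))
```

In production each function would be its own Airflow task so failures retry and roll back independently per stage, rather than rerunning the whole pipeline.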

🎯 Requirements

  • Strong experience building and maintaining production data pipelines.
  • Deep comfort with Python for data engineering (not just scripting).
  • Hands-on Airflow experience in production environments.
  • Experience with AWS, including S3 and managed services.
  • Strong SQL skills; Snowflake experience preferred.
  • Experience operating across dev, staging, and production environments.

🎁 Benefits

  • Opportunity to work from home
  • Excellent work environment
  • Medical, dental, and vision insurance
  • Up to 15 days of paid time off
  • 11 company-observed holidays
  • 8 weeks of paid parental leave
