Related skills
dynamodb aws snowflake s3 python
Description
- Design, build, and maintain production-grade data pipelines using Airflow and AWS (see the illustrative sketch after this list).
- Own data ingestion from internal systems and third-party integrations.
- Manage data storage and movement across S3, Snowflake, Snowpipe, and DynamoDB.
- Write and maintain custom Python code that runs in production.
- Work across dev, staging, and production with deployment and rollback practices.
- Partner with analytics, data science, and product teams to design reliable data models.
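For context, here is a minimal sketch (Airflow 2.x TaskFlow API) of the kind of S3-to-Snowflake pipeline described above. The bucket, stage, table, and connection details are hypothetical placeholders, not specifics of this role.

```python
# Illustrative daily pipeline: wait for a CSV export to land in S3, then
# COPY it into Snowflake via an external stage over the same bucket.
# All names (bucket, stage, table, credentials) are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def s3_to_snowflake():
    @task
    def wait_for_export(ds=None) -> str:
        # Fail (and let Airflow retry) until the day's export exists in S3.
        import boto3

        key = f"exports/{ds}/events.csv"
        boto3.client("s3").head_object(Bucket="example-bucket", Key=key)
        return key

    @task
    def load_to_snowflake(key: str) -> None:
        # COPY the staged file into a raw table; downstream models build on it.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="example-account", user="loader", password="***"
        )
        conn.cursor().execute(
            f"COPY INTO analytics.raw_events FROM @raw_stage/{key} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )

    load_to_snowflake(wait_for_export())


s3_to_snowflake()
```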
Requirements
- Strong experience building and maintaining production data pipelines.
- Deep comfort with Python for data engineering (not just scripting).
- Hands-on Airflow experience in production environments.
- Experience with AWS, including S3 and managed services.
- Strong SQL skills; Snowflake experience preferred.
- Experience operating across dev/prod environments.
Benefits
- Opportunity to work from home
- Excellent work environment
- Medical, dental, and vision insurance
- Up to 15 days of paid time off
- 11 company-observed holidays
- 8 weeks of paid parental leave