Related skills
DynamoDB, AWS, Snowflake, SQL, Python
Description
- Design, build, and maintain production data pipelines with Airflow and AWS.
- Own data ingestion from internal systems and third-party integrations.
- Manage data storage and movement across S3, Snowflake, Snowpipe, and DynamoDB.
- Write and maintain Python code that runs in production.
- Work across dev, staging, and production with deployment and rollback.
- Partner with analytics, data science, and product teams to design data models.
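As a purely hypothetical illustration of the production Python this role involves — the schema and function name below are invented, not taken from the posting — an ingest-and-normalize step for a pipeline might look like:

```python
import csv
import io

def transform_orders(raw_csv: str) -> list[dict]:
    """Parse a raw CSV export and normalize records for warehouse loading.

    Hypothetical schema: order_id, amount_cents, currency.
    """
    rows = csv.DictReader(io.StringIO(raw_csv))
    records = []
    for row in rows:
        records.append({
            "order_id": row["order_id"],
            "amount": int(row["amount_cents"]) / 100,  # cents -> dollars
            "currency": row["currency"].upper(),       # normalize currency codes
        })
    return records
```

In a real pipeline, a step like this would typically run as an Airflow task, reading raw files from S3 and writing the normalized output to Snowflake or DynamoDB.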
Requirements
- Strong experience building production data pipelines.
- Deep Python proficiency for data engineering.
- Hands-on Airflow in a production environment.
- AWS experience including S3 and managed services.
- SQL skills with Snowflake or similar warehouses.
- Willingness to mentor junior engineers and contribute to best practices.
Benefits
- Opportunity to work from home.
- Medical, dental, and vision insurance.
- Up to 15 days of paid time off.
- 12 company-observed holidays.
- 401(k) plan with company match.
- Life insurance.