Related skills
Redshift, PostgreSQL, Python, dbt, SageMaker
📝 Description
- Design, build, and maintain scalable Python data pipelines.
- Develop reusable data services for ingestion and ML deployment.
- Productionize ML workflows with SageMaker/Vertex AI.
- Implement monitoring, testing, and CI/CD for data pipelines.
- Own real-time and batch data integrations across core systems.
- Mentor engineers and contribute to documentation.
🎯 Requirements
- 5+ years in data engineering/backend with Python.
- Writes clean, modular, well-tested, production-ready Python code.
- Strong grasp of data architecture, distributed systems, and security best practices.
- Experience deploying ML workflows (SageMaker/Vertex AI).
- Familiarity with ELT tools (e.g., Fivetran) and dbt.
- SQL expertise with large analytics databases (Redshift, PostgreSQL).
🎁 Benefits
- Flexible schedules and unlimited time off.
- Fully paid health, dental, and vision coverage for you and your family; includes an HRA.
- 2% employer-paid 401(k) contribution, with matching up to 6%.
- Paid medical, family, and parental leave.
- Home-office setup allowance of $1,000 for new remote employees.
- Quarterly snack deliveries and digital subscriptions to the Boston Globe and NYT.