Related skills
docker, snowflake, sql, python, kubernetes

Description
- Build and maintain scalable data ingestion infrastructure and ops systems.
- Focus on the EL portion of the ELT stack; collaborate with analytics engineers.
- Develop robust, config-driven, event-driven pipelines for large datasets.
- Migrate data incrementally from RDS and DynamoDB into Snowflake using Kinesis or Airbyte.
- Create staging-layer architecture supporting medallion design.
- Remote work: private space and fast internet; occasional on-site meetings.
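The "config-driven, event-driven" pipeline pattern described above can be sketched as follows. This is a minimal illustration, not the company's actual code: the names (`PIPELINE_CONFIG`, `route_record`, `handle_batch`) and the source keys are hypothetical, and the real system would read from Kinesis or Airbyte and bulk-load into Snowflake staging tables.

```python
# Sketch of a config-driven, event-driven ingestion (EL) step:
# stream records are routed to a Snowflake staging table based on
# per-source configuration. All names here are illustrative.
import json

# Per-source config: which staging table each source lands in.
PIPELINE_CONFIG = {
    "rds.orders": {"staging_table": "STG_ORDERS"},
    "dynamodb.events": {"staging_table": "STG_EVENTS"},
}

def route_record(raw: bytes) -> tuple[str, dict]:
    """Decode one stream record and pick its staging table from config."""
    record = json.loads(raw)
    cfg = PIPELINE_CONFIG[record["source"]]
    return cfg["staging_table"], record["payload"]

def handle_batch(raw_records: list[bytes]) -> dict[str, list[dict]]:
    """Group a batch of records by target staging table."""
    batches: dict[str, list[dict]] = {}
    for raw in raw_records:
        table, payload = route_record(raw)
        batches.setdefault(table, []).append(payload)
    # Each grouped list would then be bulk-loaded downstream,
    # e.g. via COPY INTO a Snowflake staging table.
    return batches
```

Adding a new source in this pattern means adding a config entry rather than writing new pipeline code, which is what keeps ingestion maintainable as sources multiply.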
Requirements
- 4-6+ years in data engineering focusing on ingestion and infrastructure.
- Python and SQL proficiency with production-grade pipelines.
- Ingestion tools such as Kinesis, Airbyte, Kafka or similar.
- Hands-on Snowflake experience, including moving data from operational databases to a cloud warehouse.
- AWS services knowledge including S3, Lambda, Step Functions and RDS.
- Docker and Kubernetes experience for maintainable deployments.
Benefits
- Dynamic role in a growing startup.
- Opportunity to shape healthcare data products.
- Strong learning and growth opportunities.
- Collaborative cross-functional teams.
- Remote-friendly with occasional on-site meetings.