Related skills
azure, aws, sql, python, gcp

Description
- Design, build, and deploy data pipelines across platforms, including warehouses and real-time systems.
- Develop deep expertise in data pipelines and manage their SLAs for performance.
- Optimize data ingestion, transformation, and storage for performance and scalability.
- Collaborate with analysts, data scientists, and stakeholders to create internal data products.
- Implement data quality, observability, and governance best practices.
Requirements
- Up to 5 years of experience in tech, including internships.
- Bachelor's or Master's in CS/IT or related field.
- Proficient in SQL and Python; able to build reliable data ingestion pipelines.
- Experience collaborating with distributed teams across global time zones.
- Experience with AWS, GCP, or Azure and infrastructure as code.
- Strong written and verbal communication for technical audiences.
Benefits
- Belonging across time zones and backgrounds.
- Equal opportunity workplace.