Related skills
aws, sql, kubernetes, databricks, airflow

Description
- Build and optimise the data platform to transform raw data into insights
- Create data sources and pipelines with third-party integrations and APIs
- Deliver data projects in collaboration with business stakeholders
- Develop and deploy ML infrastructure for AI use cases
- Integrate back-end databases with the data platform
- Shape the team's direction and coach members on best practices
Requirements
- Passion for decarbonisation
- Ability to write high-quality code and build lean processes
- Experience with distributed data processing
- Experience with monitoring, testing and data quality
- Ability to work with ambiguity and own problems end-to-end
- Desirable: Airflow, AWS, Kubernetes, Spark, dbt, Terraform
Benefits
- Salary discussed on a call to match experience
- Dog-friendly offices
- Autonomy and a fast-moving culture
- Access to Octopus Employee Benefits hub