Related skills
sql python gcp airflow gke
Description
- Be mentored by one of our outstanding performance team members through a 30/60/90-day plan designed just for you
- Participate in data modelling reviews and discussions to validate model accuracy, completeness, and alignment with business objectives
- Design, develop, deploy and maintain ELT/ETL data pipelines from a variety of data sources (transactional databases, REST APIs, file-based endpoints)
- Drive hands-on delivery of data models using solid software engineering practices (e.g. version control, testing, CI/CD)
- Manage overall pipeline orchestration using Airflow (hosted in Cloud Composer), as well as execution using GCP-hosted services such as Container Registry, Artifact Registry, Cloud Run, Cloud Functions, and GKE (see the sketch after this list)
- Work on reducing technical debt by addressing code that is outdated, inefficient, or no longer aligned with best practices or business needs
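For candidates curious what this looks like in practice, below is a minimal sketch of an ELT DAG of the kind described above, written against the Airflow 2.x TaskFlow API. The endpoint URL, DAG id, and load step are hypothetical placeholders, not a description of our actual pipelines.

```python
# A minimal sketch of a daily ELT DAG: extract from a REST endpoint, then load.
# All names and URLs here are illustrative placeholders.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_orders_elt():
    @task
    def extract() -> list[dict]:
        # Pull a batch of raw records from a (hypothetical) REST endpoint.
        response = requests.get("https://api.example.com/v1/orders", timeout=30)
        response.raise_for_status()
        return response.json()

    @task
    def load(records: list[dict]) -> None:
        # A real pipeline would write to BigQuery or Cloud Storage here;
        # this stub just reports the batch size.
        print(f"Loaded {len(records)} records")

    load(extract())


example_orders_elt()
```

In a setup like ours, a DAG such as this would be version-controlled, covered by a parse/integrity test in CI, and deployed to Cloud Composer, with heavier transform steps delegated to Cloud Run jobs or GKE workloads.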
Requirements
- 5+ years of data/analytics engineering experience building, maintaining, and optimising data pipelines and ETL processes in big data environments
- Proficiency in Python and SQL
- Knowledge of software engineering practices in data (SDLC, RFC...)
- A habit of staying informed about the latest developments and industry standards in data
- Fluency in English
- As a plus: experience with our modern data stack tools; knowledge of dimensional modelling and data warehousing concepts; Spanish
Benefits
- Competitive salary and benefits package
- Discretionary bonus based on performance
- Continued personal development through training and certification
- We are Open Source friendly, following Open Source principles in our internal projects and encouraging contributions to external projects