Related skills
bigquery, sql, python, typescript, airflow

Description
- Design, develop, and maintain scalable data pipelines on GCP (Dataform, BigQuery)
- Optimize data processing and storage for performance and cost
- Write data quality tests to ensure data reliability
- Work with data teams to implement complex data models and ETL workflows
- Mentor junior engineers; contribute to docs and architecture
- Bonus: BI dashboards, ML concepts, AI governance
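The "data quality tests" responsibility above could look like the following minimal sketch in Python (function names, columns, and sample rows are illustrative, not from the posting; in practice these checks often map to Dataform assertions or SQL run against BigQuery):

```python
from typing import Any


def null_violations(rows: list[dict[str, Any]], column: str) -> list[dict[str, Any]]:
    """Return rows where the given column is missing or NULL."""
    return [r for r in rows if r.get(column) is None]


def duplicate_keys(rows: list[dict[str, Any]], key: str) -> set[Any]:
    """Return key values that appear more than once (a uniqueness check)."""
    seen: set[Any] = set()
    dupes: set[Any] = set()
    for r in rows:
        v = r.get(key)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return dupes


# Example: assertions a pipeline might run after a load step.
rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},
]
assert [r["id"] for r in null_violations(rows, "email")] == [2]
assert duplicate_keys(rows, "id") == {2}
```

Failing either check would typically fail the pipeline run before bad data reaches downstream models.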
Requirements
- Bachelor's degree in CS, Data Science, Engineering, or related field (or equivalent)
- 4+ years of data engineering experience
- Experience with GCP data services (BigQuery, Dataflow, Dataform, Pub/Sub)
- Experience setting up CI/CD pipelines for data processing
- Experience with data modeling and ETL processes
- Strong understanding of how data is applied to solve business problems
Benefits
- Teamwork and recognition programs
- Flexibility: work from home and office
- Growth: access to tools, technologies and learning opportunities
- Wellbeing: insurance plans and parental benefits
- Comprehensive rewards: competitive compensation and a strong financial foundation