Related skills
BigQuery, Snowflake, SQL, Python, GCP
Description
- Design, build, and maintain scalable data pipelines for analytical workloads.
- Develop and optimize ETL/ELT pipelines from multiple sources.
- Collaborate with backend and platform engineers to integrate data pipelines.
- Manage and optimize cloud data warehouses, primarily BigQuery.
- Implement data governance, security, and privacy best practices.
- Collaborate with analytics teams to define data models for self-service BI.
Requirements
- Strong English (written and verbal) required.
- 5+ years in data engineering building scalable data pipelines.
- Strong SQL for data modeling, transformation, and optimization.
- Expertise in Python for data processing and pipeline development.
- Experience with GCP, BigQuery, Cloud Storage, and Pub/Sub.
- Familiarity with DBT, Dataflow, and Apache Beam, plus orchestration tools such as Dagster or Airflow.
Benefits
- Great projects with global brands.
- Inclusive, safe workplace culture.
- Generous training budgets and certifications.
- Flexible vacation and work-life balance.
- Customizable extended health and dental plans.
- Remote/hybrid-friendly work arrangements.