Related skills
bigquery, looker, etl, sql, python
Description
- Partner with data science, SRE, and FinOps to translate requirements into scalable data solutions.
- Build and maintain ETL/ELT pipelines ingesting data from disparate systems using GCP services.
- Implement and monitor data quality checks, testing, and observability to keep datasets and dashboards timely and reliable.
- Support incident triage for pipelines; troubleshoot issues and document root causes.
- Develop user-facing data assets (curated tables, views, Looker dashboards) for cloud cost and operational reporting.
- Leverage AI tools to accelerate development, testing, and analysis.
Requirements
- At least 2 years of experience with large-scale data in industry or relevant research.
- Strong SQL skills; relational databases; dimensional data modeling and query tuning.
- Hands-on experience with GCP BigQuery and Dataflow/Dataproc/Data Fusion.
- Proficiency with Python and common data libraries/frameworks.
- Experience building and operating ETL/ELT pipelines; API integration; structured and semi-structured data.
- Git, CI/CD workflows, and Infrastructure as Code concepts.
Benefits
- Benefits overview and perks on Box Poland page.
- Hybrid work: in the Warsaw office three days per week (Tue-Thu).