Related skills
SQL, Python, GCP, data modeling, data pipelines

Description
- Build and manage business data pipelines; transform Firefox telemetry into high-quality datasets.
- Partner with data scientists, product, and marketing to turn datasets into models and metrics.
- Ensure data quality with observability tooling; join a weekly triage rotation to diagnose and fix issues.
- Collaborate with data and platform engineers to improve workflows.
- Maintain data governance and privacy across tens of TB of data daily.
Requirements
- 4+ years data engineering experience.
- Proficiency in SQL and Python; interest in AI-enabled data workflows.
- Experience mapping complex business processes into analytical data models.
- Strong software engineering fundamentals; modular, reusable code; scalable processing.
- Own projects end-to-end; manage blockers; keep team informed.
- Experience with GCP, distributed systems, databases, queues, and batch/stream processing.
Benefits
- Generous performance-based bonus plans
- Rich medical, dental, and vision coverage
- Generous retirement contributions with 100% immediate vesting
- Quarterly all-company wellness days
- Country-specific holidays plus a day off for your birthday
- One-time home office stipend