Related skills
SQL, Python, GCP, data modeling, data pipelines

Description
- Build and manage business data pipelines; transform Firefox telemetry into reliable datasets.
- Partner with data scientists, product, and marketing to turn datasets into actionable models.
- Act as guardian of data quality, keeping datasets accurate with observability tools and weekly triage.
- Evolve the platform by partnering with data and platform engineers to improve our workflows.
- Maintain data integrity; enforce governance and privacy across terabytes of data daily.
Requirements
- 4+ years of professional data engineering experience.
- SQL and Python proficiency; eager to integrate AI into data workflows.
- Data modeling experience mapping complex processes to extensible analytical models.
- Strong software engineering fundamentals; modular, reusable code; scalable for big data.
- Self-directed, with end-to-end project ownership and clear communication across time zones.
- Experience with cloud distributed systems (GCP) and distributed databases, message queues, or batch/stream processing.
Benefits
- Generous performance-based bonus plans.
- Rich medical, dental, and vision coverage.
- Generous retirement contributions with 100% immediate vesting.
- One-time home office stipend.
- Annual professional development budget.
- Quarterly well-being stipend.