Related skills
bigquery python kubernetes gcp dbt
Description
- Design, build, and maintain data pipelines for Lighthouse.
- Design streaming and batch pipelines powering products and analytics.
- Improve data quality, latency, and reliability.
- Collaborate with data consumers to understand needs and deliver data solutions.
- Mentor engineers and share your expertise.
- Bridge data and business to unlock value across teams.
Requirements
- Experience building scalable data pipelines in a data engineering role.
- Proficiency in Python for data processing.
- Strong knowledge of cloud databases (BigQuery, Snowflake, Databricks).
- Experience with streaming systems (Kafka or Google Cloud Pub/Sub).
- Familiarity with data governance/quality tools (Atlan, Soda).
- Excellent communication and stakeholder management.
- Experience mentoring other engineers.
Benefits
- Flexible time off: Autonomy to manage your work-life balance.
- Alan Flex benefits: 160€/month for food or nursery.
- Flexible remuneration: Tax-free payroll deductions for benefits.
- Wellbeing support: Subsidized ClassPass subscription.
- Comprehensive health insurance: 100% coverage for you and dependents.
- Impactful work: Shape products used by 85,000+ users worldwide.
- Referral bonuses: Earn rewards for bringing in new talent.