Related skills
bash, etl, sql, python, airflow

Description
- Own core data pipelines; scale data processing to meet Lyft's growth.
- Evolve data models and schemas with business needs.
- Implement and maintain systems to monitor data quality.
- Develop tools for self-service ETL and faster data processing.
- Write clean, scalable, cost-efficient code.
- Conduct code reviews to uphold quality and share knowledge.
Requirements
- 5+ years in data engineering or a related field.
- Strong SQL; experience with Trino or Spark/PySpark.
- Experience building and optimizing complex data models and ETL pipelines.
- Hands-on experience with workflow tools (Airflow or similar).
- Proficiency in Python or Bash.
- Ability to collaborate with data analytics/science and engineering teams.
Benefits
- Hybrid work: 3 days/week in the Toronto office.
- Flexibility to work from anywhere up to 4 weeks per year.
- In-office perks available.