Related skills
Snowflake, ETL, SQL, Python, Databricks
Description
- You will be part of a team with a large amount of ownership and autonomy.
- Large scope for company-level impact working on catalog data.
- You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
- You will ship high-quality, scalable, and robust solutions with a sense of urgency.
- You will have the freedom to suggest and drive organization-wide initiatives.
Requirements
- 6+ years in data/software engineering focused on data pipelines.
- Strong SQL expertise and Python proficiency.
- Experience building ETL/ELT pipelines.
- Experience with cloud-based data technologies such as Snowflake, Databricks, or Trino/Presto.
- Strong cross-functional communication and ownership.
- Experience with large codebases on cross-functional teams.
Benefits
- Flexible work locations, including remote or office.
- Flex First remote work policy.
- New hire equity grant and annual refresh grants.
- Benefits offerings overview available.