Related skills
Snowflake, SQL, Python, dbt, Airflow

Description
- Define architecture of scalable data pipelines powering analytics
- Establish engineering standards for performance and cost efficiency
- Design data models and transformation layers for trusted datasets
- Own complex data integrations and set patterns for consistency and scale
- Architect data compute/storage platforms for performance, scalability, cost
- Govern data security, privacy, and compliance with security teams
Requirements
- 6+ years designing and scaling data platforms
- Expert SQL and Python
- Ownership of data stack: Airflow, Snowflake, dbt
- Experience building cloud-based data systems with a focus on scalability and cost
- Experience enabling AI/ML data foundations
Benefits
- Health, dental, vision coverage
- 401K with company match
- Parental leave and wellness programs
- Unlimited vacation policy
- Salary transparency and competitive range
- Career growth and inclusive culture