Related skills
Redshift, SQL, Python, Databricks, dbt

Description
- Develop SQL- and Python-based ETL pipelines that deliver data to consumers.
- Collaborate with data scientists, analysts, and engineers to define data use cases.
- Architect and QA performant data models for analytics in a data warehouse.
- Triage and debug on-call requests with the Analytics Engineering team.
- Design LLM-friendly DWH schemas and domain guides for AI-driven analysis.
- Partner with Analytics leadership to plan BI tooling for easy data access.
Requirements
- 3+ years in analytics, data, or software engineering.
- Experience with dbt, Databricks, or similar SQL transformation tools.
- Bachelor's degree or higher in a quantitative field.
- Expert SQL; proficiency in Python.
- Experience building metrics and analyzing large datasets.
- Excellent verbal and written communication skills.
Benefits
- Equity stake
- Flexible work environment; remote or in-office
- WFH stipend to support home office setup
- Unlimited PTO
- 401(k) matching
- Health, vision, dental, and life insurance