Related skills
Redshift, Terraform, Snowflake, Python, Databricks

Description
- Design and implement robust data pipelines into the data lake
- Build secure, compliant data warehousing for healthcare
- Partner with analytics, data science, and engineering teams
- Foster data as a strategic asset across the organization
Requirements
- 4+ years designing and implementing large-scale data systems
- Strong Python skills, including packaging tooling (pip, Poetry) and testing/validation libraries (pytest, pydantic)
- Strong PySpark experience (DataFrame API), including job optimization
- Experience with Databricks; Redshift or Snowflake is a plus
- Architectural patterns for high-volume ETL pipelines
- Data modeling experience, including Medallion architecture, metrics definitions, and documentation
- Excellent verbal and written communication
- BA/BS in CS/Engineering or equivalent
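The Medallion architecture named in the requirements organizes a lakehouse into bronze (raw), silver (validated/cleaned), and gold (aggregated) layers. A minimal stdlib-only Python sketch of that layering follows; in a Databricks role these transforms would be PySpark DataFrame operations, and the record fields here (`patient_id`, `reading`) are hypothetical:

```python
# Medallion-style layering: bronze = raw ingested rows,
# silver = validated/type-cast rows, gold = aggregated metrics.
# All field names are illustrative, not from a real schema.

bronze = [
    {"patient_id": "p1", "reading": "98.6"},
    {"patient_id": None, "reading": "99.1"},   # missing key -> dropped in silver
    {"patient_id": "p2", "reading": "bad"},    # unparsable -> dropped in silver
    {"patient_id": "p1", "reading": "97.9"},
    {"patient_id": "p2", "reading": "101.3"},
]

def to_silver(rows):
    """Validate and type-cast raw rows; drop records failing quality checks."""
    out = []
    for r in rows:
        if not r.get("patient_id"):
            continue
        try:
            out.append({"patient_id": r["patient_id"],
                        "reading": float(r["reading"])})
        except ValueError:
            continue
    return out

def to_gold(rows):
    """Aggregate silver rows into a per-patient average reading."""
    sums, counts = {}, {}
    for r in rows:
        pid = r["patient_id"]
        sums[pid] = sums.get(pid, 0.0) + r["reading"]
        counts[pid] = counts.get(pid, 0) + 1
    return {pid: sums[pid] / counts[pid] for pid in sums}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)
```

The same bronze-to-silver validation step is where a library like pydantic would typically sit in production, with the gold layer computed by grouped aggregations in PySpark.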
Benefits
- Equity with comprehensive benefits
- Healthcare coverage
- Monthly wellness stipend
- Retirement savings match
- Lifetime Headspace membership
- Generous parental leave