Related skills: redshift, terraform, aws, sql, airflow

Description
- Data Infrastructure - Orchestrate the data platform (AWS, Terraform)
- Data Pipelines - Build pipelines with Spark, Python, PySpark, SQL
- Data Modeling - Design logical/physical schemas and data models
- Cross-functional collaboration with Product, Eng, Data Science, Analytics/BI, and Ops
- Own data quality across healthcare claims and member experience
- Manage BI development lifecycle from semantic models to visualization
- Enable fellow developers to self-service their data needs
- Leverage best practices to build the next-gen data ecosystem
Requirements
- BS degree in Computer Science or related field, or equivalent experience
- 4+ years as data engineer with Scala, Python/PySpark, and SQL
- 4+ years schema design, dimensional modeling, and data warehousing
- Expertise in ETL design, implementation, and maintenance
- Experience with Spark, Presto, Hive, Redshift; Airflow is a plus
- Excellent communication with stakeholders across teams
Benefits
- Hybrid work schedule at Plano office (in-office 3 days/week)
- 115,000 stock options
- Health insurance
- 401(k) plan
- Paid time off
- Professional development and mentorship
- Flexible work arrangements