Related skills
sql, python, databricks, data modeling, airflow

📋 Description
- Build large, scalable analytics pipelines with modern data tech.
- Define data assets (data models) and Spark SQL jobs to populate them.
- Collaborate on long-term Data Platform architecture for efficiency and scalability.
- Conceptualize and own data architecture for multiple large-scale projects; evaluate tradeoffs.
- Work with engineers, product managers, and data scientists to translate data needs into insights.
- Design, build, and launch data models and visualizations across domains.
🎯 Requirements
- 5+ years Spark, Python, Java, C++, or Scala development experience.
- 5+ years SQL experience.
- 5+ years of experience with schema design, dimensional modeling, and medallion architectures.
- Experience with the Databricks platform and data lake architectures.
- Excellent product thinking and ability to influence cross-functional teams.
- BS degree in Computer Science or related technical field, or equivalent.
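For context on the medallion-architecture requirement above (this sketch is not part of the posting): data flows through bronze (raw), silver (cleaned), and gold (business-ready) layers. The example below uses Python's sqlite3 standing in for Spark SQL purely for illustration; all table and column names are hypothetical.

```python
# Minimal medallion (bronze -> silver -> gold) sketch.
# sqlite3 stands in for Spark SQL / Delta Lake here; in Databricks these
# would be Delta tables populated by Spark SQL jobs.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw events landed as-is, including duplicates and bad rows.
cur.execute("CREATE TABLE bronze_events (event_id TEXT, user_id TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO bronze_events VALUES (?, ?, ?)",
    [("e1", "u1", 10.0),
     ("e1", "u1", 10.0),   # duplicate landed twice
     ("e2", "u2", None),   # bad row: missing amount
     ("e3", "u1", 5.0)],
)

# Silver: deduplicated, validated records.
cur.execute("""
    CREATE TABLE silver_events AS
    SELECT DISTINCT event_id, user_id, amount
    FROM bronze_events
    WHERE amount IS NOT NULL
""")

# Gold: business-level aggregate ready for analytics and dashboards.
cur.execute("""
    CREATE TABLE gold_user_spend AS
    SELECT user_id, SUM(amount) AS total_spend
    FROM silver_events
    GROUP BY user_id
""")

print(dict(cur.execute("SELECT user_id, total_spend FROM gold_user_spend").fetchall()))
# → {'u1': 15.0}
```

Each layer is a separately queryable table, so downstream consumers can pick the refinement level they need; in a real Databricks pipeline the same SQL would run as scheduled Spark jobs writing Delta tables.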