Related skills
sql · python · databricks · data modeling · airflow

📋 Description
- Define company data assets and data models with Spark/SQL
- Define long-term Data Platform strategy for scalable, reliable architecture
- Own data architecture for large-scale projects and evaluate design/operational costs
- Collaborate with engineers, product managers, and data scientists to translate data needs into solutions
- Design, build, and launch data models and visualizations for multiple use cases
- Optimize data pipelines, dashboards, and systems to accelerate data artifact development
🎯 Requirements
- 5+ years of Spark, Python, Java, C++, or Scala development experience
- 5+ years of SQL experience
- 5+ years of experience with schema design, dimensional data modeling, and medallion architectures
- Experience with the Databricks platform and data lake architectures for large-scale data processing and analytics
- Excellent strategic product thinking and communication skills to influence product and cross-functional teams by identifying data opportunities that drive impact
- BS degree in CS or a related field, or equivalent experience
- Experience designing, building, and maintaining data processing systems
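For candidates unfamiliar with the dimensional-modeling requirement above, the core idea is a star schema: descriptive attributes live in dimension tables, numeric measures in fact tables keyed to them. The sketch below is illustrative only (table and column names are invented, and it uses SQLite rather than the Spark/Databricks stack named in the posting) but the SQL pattern carries over directly.

```python
import sqlite3

# Minimal star-schema sketch: one dimension table plus one fact table,
# queried with a typical join-and-aggregate analytics pattern.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key
cur.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT,
        category    TEXT
    )
""")

# Fact table: numeric measures plus foreign keys into dimensions
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    )
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 3, 30.0), (2, 1, 2, 20.0), (3, 2, 1, 15.0)])

# Typical dimensional query: aggregate fact measures grouped by a
# dimension attribute
rows = cur.execute("""
    SELECT d.name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.name
    ORDER BY d.name
""").fetchall()
print(rows)  # [('Gadget', 15.0), ('Widget', 50.0)]
```

In a medallion architecture, raw ingested data (bronze) is cleaned and conformed (silver) before being shaped into fact and dimension tables like these for analytics (gold).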