Related skills
Snowflake, Python, Databricks, Scala, dbt

Description
- Design end-to-end data solutions: data lakes, warehouses, ETL/ELT, APIs.
- Architect scalable, low-latency streaming pipelines with Kafka, Flink, and Spark.
- Design and orchestrate automated workflows with Airflow.
- Develop scalable data architectures for analytics, ML, and real-time processing.
- Define data governance, metadata, and data quality standards.
- Partner with business and IT to translate needs into data architecture.
Requirements
- 5+ years of experience in data architecture or data engineering.
- Proficient with Spark, Kafka, and Airflow.
- Cloud experience with Azure or Google Cloud and cloud-native data services.
- Strong Python and Scala skills, plus solid SQL and data modeling.
- Experience with dbt, Snowflake, and Databricks.
- Knowledge of data governance, metadata, data quality, and security.
Benefits
- Comprehensive rewards package.
- Flexible hours and birthday off.
- Access to cutting-edge technology and global projects.
- Inclusive culture with active employee networks.