Related skills
Snowflake, Airflow, Kafka, Spark, Flink
📋 Description
- Develop, optimize, and maintain data pipelines and data platform components.
- Implement data federation, semantic modeling, and governance frameworks.
- Collaborate with Engineering, Analytics, and Product teams to deliver data assets.
- Apply AI-driven tools to accelerate development and improve data quality.
- Investigate and adopt emerging data technologies aligned with strategy.
- Operate independently within distributed teams.
🎯 Requirements
- Snowflake expertise (highest priority).
- Kafka, Flink, Spark, and Airflow experience.
- Semantic modeling and data governance (e.g., with Cube); data federation.
- Distributed systems and modern data engineering patterns.
- AI tools to support data engineering workflows.
- Strong communication and collaboration across remote teams.
🎁 Benefits
- Equal opportunity employer.