Related skills
Java, Python, Scala, Hadoop, Airflow
📋 Description
- Design, build and manage data pipelines integrating user event data (see the pipeline sketch after this list).
- Develop canonical datasets to track key product metrics.
- Collaborate with cross-functional teams to meet data needs.
- Implement robust, fault-tolerant data ingestion and processing.
- Participate in data architecture and engineering decisions.
- Ensure data security, integrity, and compliance.
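As one illustration of the pipeline work described above, here is a minimal sketch of a daily user-event pipeline, assuming Airflow as the scheduler. The DAG id, task names, and placeholder logic are hypothetical, not details from this posting.

```python
# A minimal sketch of a daily user-event pipeline, assuming Airflow.
# DAG id, task names, and the ingest/aggregate logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_events(ds, **_):
    """Pull one day of raw user events into staging (placeholder logic)."""
    print(f"ingesting events for {ds}")


def build_metrics(ds, **_):
    """Aggregate staged events into a canonical daily-metrics dataset."""
    print(f"building canonical metrics for {ds}")


with DAG(
    dag_id="user_event_metrics",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    metrics = PythonOperator(task_id="build_metrics", python_callable=build_metrics)
    ingest >> metrics  # metrics build only after ingestion succeeds
```

Scheduling the metrics task downstream of ingestion is one simple way to keep the canonical dataset fault-tolerant: a failed ingest run blocks the aggregate rather than producing a partial metric.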
🎯 Requirements
- 3+ years as a data engineer and 8+ years in software engineering.
- Proficiency in Python, Scala or Java.
- Experience with distributed processing technologies (Hadoop, Flink) and storage systems (HDFS, S3).
- Expertise with an ETL scheduler (Airflow, Dagster, Prefect, or similar).
- Solid Spark knowledge: write, debug and optimize Spark code (see the sketch after this list).
- Willingness to work in a hybrid model with in-person collaboration.
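As a taste of the Spark work called out above, here is a minimal PySpark sketch that rolls raw events up into a daily active-users metric. The input path, event schema, and output location are assumptions for illustration, not details from the posting.

```python
# A minimal PySpark sketch: aggregate raw user events into a daily
# active-users metric. Paths, schema, and metric are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_active_users").getOrCreate()

events = spark.read.parquet("s3://bucket/events/")  # hypothetical event store

daily_active = (
    events
    .where(F.col("event_type") == "session_start")  # assumed event taxonomy
    .groupBy(F.to_date("event_ts").alias("day"))
    .agg(F.countDistinct("user_id").alias("active_users"))
    .repartition("day")  # one partition per day avoids small-file output
)

daily_active.write.mode("overwrite").partitionBy("day").parquet(
    "s3://bucket/metrics/daily_active_users/"  # hypothetical sink
)
```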
🎁 Benefits
- Equity offered.
- Reasonable accommodations for applicants with disabilities.