Related skills
Java, SQL, Python, Scala, Airflow

Description
- Design and build scalable data infra for analytics.
- Develop and operate ETL pipelines for large datasets.
- Use Spark, Hive, and similar frameworks for processing.
- Model data with SQL and analytics-focused schemas.
- Write prod-grade code in Python, Java, Scala, or Go.
- Collaborate with analysts and stakeholders to provide reliable data.
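To make the ETL duties above concrete, here is a minimal, hedged extract-transform-load sketch using only the Python standard library. The feed, schema, and table name (`purchases`) are invented for illustration; a production pipeline would read from real sources and run under an orchestrator such as Airflow.

```python
import csv
import io
import sqlite3

# Extract: a toy raw CSV feed (hypothetical data, stands in for a real source).
raw = "user_id,amount\n1,10.5\n2,not_a_number\n1,4.5\n"

# Transform: cast and validate each record, dropping malformed rows.
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    try:
        rows.append((int(rec["user_id"]), float(rec["amount"])))
    except ValueError:
        continue  # skip records that fail type validation

# Load: write the cleaned rows into an analytics table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purchases (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO purchases VALUES (?, ?)", rows)
count, total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM purchases"
).fetchone()
```

The same three stages scale up directly: swap the in-memory feed for object storage or a message queue, and the SQLite sink for a warehouse.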
Requirements
- 5+ years of Data Engineering experience with pipelines and infra.
- Strong SQL skills: joins, aggregations, unions, and window functions.
- Data modeling and schema design for analytics.
- ETL pipelines with Airflow or similar orchestration tools.
- Big Data ecosystems: Hadoop, Hive, Spark, or related tech.
- Programming in Python, Java, Scala, or Go.
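As a small illustration of the SQL skills listed above (joins, aggregations, and window functions), here is a self-contained sketch using Python's built-in `sqlite3`. The tables and columns (`users`, `events`, `amount`) are hypothetical examples, not part of the role.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE events (user_id INTEGER, amount REAL);
INSERT INTO users VALUES (1, 'ana'), (2, 'bo');
INSERT INTO events VALUES (1, 10), (1, 20), (2, 5);
""")

# Join + aggregation: total spend per user, highest first.
totals = conn.execute("""
    SELECT u.name, SUM(e.amount) AS total
    FROM users u JOIN events e ON e.user_id = u.id
    GROUP BY u.name
    ORDER BY total DESC
""").fetchall()

# Window function: running total of spend within each user.
running = conn.execute("""
    SELECT user_id,
           amount,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY rowid) AS running_total
    FROM events
    ORDER BY user_id, rowid
""").fetchall()
```

Window functions require SQLite 3.25+, which ships with recent Python builds; the same queries translate directly to Hive or Spark SQL.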
Benefits
- 100% remote work from anywhere.
- Competitive USD pay.
- Paid time off.
- Autonomy to manage your time.
- Work with top American companies.
- Culture that values wellbeing and balance.