Related skills
AWS, Snowflake, SQL, Python, Databricks

Description
- Independently own data features and small projects end-to-end.
- Develop, operate, and maintain components of the data ecosystem.
- Improve code quality, reduce tech debt, modernize the stack.
- Write reliable, performant, scalable code with tests.
- Monitor deployed code; fix bugs; ensure stability.
- Develop and optimize DAGs with Airflow 2.0 and Astronomer (a minimal sketch follows this list).
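For context on the Airflow responsibility above, here is a minimal sketch of a TaskFlow-style DAG as supported since Airflow 2.0. The DAG id, schedule, and task bodies are illustrative assumptions, not details from this posting.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule_interval="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_pipeline():
    # Hypothetical extract/transform/load steps; real tasks would read from
    # and write to systems like S3, Databricks, or Snowflake.
    @task
    def extract():
        return [1, 2, 3]

    @task
    def transform(records):
        return [r * 2 for r in records]

    @task
    def load(records):
        print(f"loaded {len(records)} records")

    # TaskFlow infers task dependencies from these function calls.
    load(transform(extract()))


# Calling the decorated function registers the DAG with the scheduler.
example_pipeline()
```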
Requirements
- 5+ years in data engineering with cloud tech (AWS, Databricks, Snowflake).
- Proficiency in Python and SQL.
- Experience with Spark, Trino, Hive, and cloud storage.
- Apply software development practices to data work, including testing and CI/CD.
- Understand the analytics and data needs of corporate functions.
- Build frameworks and APIs for automated data management and governance.
Benefits
- Professional and stable working environment.
- Latest technology and equipment.
- Remote work options, including from outside the country (subject to required authorizations).
- 28 calendar days vacation and up to 5 paid sick days.
- 18 weeks of paid parental leave.
- Mental health benefits.