Related skills
aws snowflake python dbt airbyte

Description
- Collaborate with data scientists, ML engineers, and product teams on data solutions.
- Build and maintain data pipelines with dbt, Dagster, Snowflake, Estuary, and Airbyte.
- Enable deployments via software integration, supporting DevOps and CI/CD.
- Design and operate AWS data infrastructure for scale, performance, and cost efficiency.
- Create frameworks and self-service tooling to speed data delivery.
🎯 Requirements
- Bachelor's degree or relevant experience.
- 5+ years as a software or data engineer in a fast-paced environment.
- Proficiency with Snowflake data warehouse and best practices.
- Experience with ETL, data modeling, and version control (dbt, GitHub).
- Ability to build robust data models that ensure data integrity and clarity for internal and external users.
- Knowledge of AWS data ecosystem.
- Experience with data extraction tools such as Estuary and Airbyte.
- Python and JavaScript experience.