Related skills: Snowflake, SQL, Python, dbt, data modeling
📋 Description
- Design and deliver data products for Fever’s partner ecosystem.
- Develop real-time/batch data APIs, dashboards, and exports.
- Build ETL/ELT workflows with Python, dbt, Airflow, and Snowflake.
- Ensure data accuracy and consistency with business context.
- Collaborate with PMs and stakeholders to translate needs into data structures.
- Own end-to-end delivery from idea to deployment in cross-functional teams.
🎯 Requirements
- Bachelor’s or Master’s degree in Computer Engineering, Data Engineering, or a related field.
- Strong proficiency in Python, SQL, Airflow, and Snowflake.
- Deep understanding of data modeling and of exposing backend data across products.
- Experience building data products for external consumption (APIs, dashboards).
- Excellent communication and collaboration with stakeholders.
- Fluent in Spanish and English; able to work with local and global teams.
🎁 Benefits
- 40% discount on Fever events and experiences.
- Stock options.
- Health insurance and Cobee flexible remuneration.
- English lessons.
- Gympass membership.
- Payflow salary advance option.