Related skills
AWS, Snowflake, SQL, Python, Tableau

Description
- Contribute to the integrity and evolution of the Data Platform tech stack.
- Design and implement core Data Platform features; translate requirements into production solutions.
- Execute and maintain DevOps workflows for the Data Platform, with monitoring and upgrades.
- Build robust ETL/ELT pipelines for Dimagi data using SQL and Python.
- Design and develop data warehouse transformations using SQL and dbt.
- Collaborate with internal teams and external partners on enterprise data architectures.
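To illustrate the kind of ETL/ELT pipeline work described above, here is a minimal Python sketch. All function, field, and variable names (`extract_rows`, `transform_rows`, `load_rows`, `warehouse`) are hypothetical and not part of the role description.

```python
# Minimal ETL sketch: extract raw records, normalize them, load into a store.
# All names and sample data here are hypothetical illustrations.

def extract_rows():
    # In practice this step would pull from an API, file, or source database.
    return [
        {"id": "1", "country": " us ", "amount": "10.5"},
        {"id": "2", "country": "KE", "amount": "3.0"},
    ]

def transform_rows(rows):
    # Normalize types and clean up the country code.
    return [
        {
            "id": int(r["id"]),
            "country": r["country"].strip().upper(),
            "amount": float(r["amount"]),
        }
        for r in rows
    ]

def load_rows(rows, store):
    # A real pipeline would write to a warehouse table; here, a plain list.
    store.extend(rows)
    return len(rows)

warehouse = []
loaded = load_rows(transform_rows(extract_rows()), warehouse)
```

In production, each stage would typically run as an orchestrated task (e.g., in Airflow or Prefect) with logging and retries rather than as plain function calls.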
๐ฏ Requirements
- 2-5 years of data engineering experience with scalable data systems.
- Experience delivering maintainable solutions with version control and tests.
- Hands-on experience building production pipelines with tools such as dbt, Airflow, Prefect, Fivetran, or Talend.
- Experience with cloud data platforms (AWS, Snowflake) and related ingestion and storage technologies.
- Strong SQL skills and Python proficiency, including common data toolkits.
- Familiarity with dimensional modeling concepts: star schemas, Kimball vs. Inmon methodologies.
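The star-schema concept mentioned in the requirements can be sketched with an in-memory SQLite database: a fact table of measures keyed to descriptive dimension tables. Table, column, and sample data names here are hypothetical illustrations, not part of the posting.

```python
import sqlite3

# A tiny star schema: a sales fact table joined to a product dimension.
# All table/column names and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (10, 1, 10.0), (11, 1, 5.0), (12, 2, 3.5);
""")

# Typical star-schema query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.product_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
```

In a dbt project, the fact and dimension tables would usually be built as separate models, with the join-and-aggregate step expressed as a downstream SQL model.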
Benefits
- 100% employer-sponsored medical with HRA
- Voluntary dental and vision plans
- 401(k) with up to a 4% employer match
- Employee stock option plan
- 30 days paid time off
- Flexible work schedule