Related skills
AWS, PostgreSQL, SQL, MySQL, Python

📝 Description
- Develop and maintain end-to-end data pipelines.
- Integrate data from CRM, product, marketing, orders, and support.
- Improve data architecture, quality, monitoring, and availability.
- Write data transformations in SQL and Python for data products.
- Collaborate with cross-functional teams to meet data needs.
- Champion Samsara's growth and customer-focused principles.
๐ฏ Requirements
- Bachelor's degree in CS or related field.
- 3+ years data engineering / ETL experience.
- 3+ years building large-scale production data pipelines.
- Experience with cloud data lakes/warehouses and ETL/ELT tools.
- Proficiency in Python and SQL.
- Experience with Fivetran and dbt (including dbt Cloud).
- Experience with a major cloud platform (AWS, Azure, or GCP) and a cloud data warehouse (BigQuery, Redshift, or Snowflake).
🎁 Benefits
- Remote and flexible working options.
- Health benefits.
- Competitive total compensation with RSUs.