Related skills
bigquery, redshift, docker, terraform, snowflake
Description
- Design and build data transfer solutions to and from our data warehouse.
- Design data exchange patterns with enterprise clients (push, cloud, and event-driven).
- Support partner onboarding and improve platform configurability.
- Collaborate closely with the team, contributing defensible designs.
- Champion test-first design; write tests to ensure coverage and reliability.
- Develop hardened data models and CI/CD-managed pipelines for reporting and ML.
Requirements
- 6+ years in data engineering for internal and external customers.
- 4+ years designing end-to-end data pipelines in cloud (GCP/AWS/Azure).
- 4+ years Python experience writing efficient, tested code.
- 2+ years building streaming data ingestion pipelines.
- 1+ year ML support and/or MLOps.
- Advanced SQL with BigQuery, Snowflake, or Redshift, including performance tuning.
Benefits
- Equity participation.
- Flexible PTO and parental leave.
- Fully covered medical, dental, and vision insurance.
- Lifestyle stipend to support wellbeing.
- Flexible remote/WFH policy.
- Dallas Deep Ellum office option for in-person work.