Added: 7 days ago
Type: Full time

Related skills

BigQuery, Redshift, Docker, Terraform, Snowflake

πŸ“‹ Description

  • Design and build data transfer solutions to and from our data warehouse.
  • Design data exchange patterns with enterprise clients (push, cloud, and event-driven).
  • Support partner onboarding and improve platform configurability.
  • Work collaboratively across the team, contributing well-reasoned, defensible designs.
  • Champion test-first design; write tests to ensure coverage and reliability.
  • Develop hardened, CI/CD-managed data models and pipelines for reporting and ML.

🎯 Requirements

  • 6+ years in data engineering for internal and external customers.
  • 4+ years designing end-to-end data pipelines in cloud (GCP/AWS/Azure).
  • 4+ years Python experience writing efficient, tested code.
  • 2+ years building streaming data ingestion pipelines.
  • 1+ year ML support and/or MLOps.
  • Advanced SQL with BigQuery, Snowflake, or Redshift, including performance tuning.

🎁 Benefits

  • Equity participation.
  • Flexible PTO and parental leave.
  • 100% covered medical, dental, and vision insurance.
  • Lifestyle stipend to support wellbeing.
  • Flexible remote/WFH policy.
  • Dallas Deep Ellum office option for in-person work.
