Related skills
aws, sql, python, dbt, airflow
Description
- End-to-end ownership of data warehousing infrastructure, its KPIs and SLAs.
- Design, build, and optimize large-scale, high-performance data pipelines.
- Architect and evolve Sezzle's data ecosystem for reliability, scalability, and efficiency.
- Lead ETL/ELT workflows with Redshift, dbt, AWS DMS, and related tooling.
- Partner with cross-functional teams to gather requirements and deliver robust datasets.
- Evaluate new technologies to evolve Sezzle's data stack and infrastructure.
Requirements
- 12+ years in data engineering with a track record of scalable production systems.
- Deep expertise with AWS Redshift or comparable data warehouses, including performance tuning.
- Hands-on ETL/ELT experience with dbt, AWS DMS, and related tools.
- Advanced SQL proficiency, plus Python, Scala, or Java.
- Experience building AWS-based data platforms (S3, Lambda, Glue, EMR).
- Track record designing scalable, fault-tolerant pipelines using Airflow, Dagster, or Prefect handling 100GB-1TB/day.
Benefits
- Unlimited PTO, volunteer hours, and sabbatical
- Insurance: medical, dental, vision, life, and short/long-term disability (STD/LTD)
- Highly discounted Lifetime gym membership
- 401k with match
- Collaborative, fun co-workers
- The opportunity to join a fast-growing FinTech alongside a driven team