Related skills
Redshift, Looker, AWS, Python, dbt
Description
- Design & architect scalable infrastructure and tooling.
- Tooling & SDK development: internal CLI tools, web apps, and Python SDKs built on Ray.
- Cloud infrastructure on AWS to support high-throughput data processing.
- Data pipeline engineering: dbt, Iceberg/Parquet, and Redshift feeding Looker for BI.
- CI/CD & containerization: GitLab CI/CD; Docker + Kubernetes.
- Mentorship & leadership: establish standards and guide the metrics roadmap.
Requirements
- 5+ years of software engineering, backend/data infra, or platform engineering.
- Deep Python expertise for scalable backend services, SDKs, and CLI tools.
- Experience with AWS, Redshift, Iceberg, Parquet.
- dbt experience for data transformations; Looker BI integrations.
- Hands-on Ray experience for scaling Python workloads.
- Docker/Kubernetes; CI/CD pipelines via GitLab.
- Proven ability to write technical design documents and lead architecture reviews.
Benefits
- Medical, dental, and vision coverage
- 401(k) with company match
- Health Savings Account (HSA)
- Life insurance
- Pet insurance
- Hybrid work schedule with in-office options or fully remote