Related skills
Redshift, Looker, AWS, Python, dbt

Description
- Design and architect scalable infrastructure and tooling.
- Build internal CLI tools, web apps, and Python SDKs using Ray.
- Architect and maintain AWS cloud infrastructure for high-throughput data.
- Develop data pipelines with dbt, Iceberg/Parquet; surface data to Looker.
- Drive CI/CD and containerization using GitLab and Docker/Kubernetes.
- Mentor peers and lead the metrics infrastructure roadmap.
Requirements
- 5+ years of software engineering in backend/data infra or platform.
- Deep Python for scalable backends, SDKs, and CLIs.
- AWS, Redshift, Iceberg, and Parquet experience.
- dbt for data transformations; Looker BI experience.
- Ray for scaling Python workloads.
- Docker/Kubernetes and GitLab CI/CD; IoT telemetry bonus.
Benefits
- Hybrid in-office time in Boston, Pittsburgh, or Las Vegas; remote option.
- Medical, dental, vision, 401k with company match, and HSA.
- Life insurance and pet insurance included.
- Diversity-focused, inclusive culture.
- Pay transparency; details shared during hiring.