Related skills
Redshift, Looker, AWS, Python, dbt

Description
- Design & Architect: Author design docs for scalable infrastructure and tooling.
- Tooling & SDK Development: Build CLI tools, web apps, and Python SDKs using Ray.
- Cloud Infrastructure: AWS-based infra for high-throughput data processing.
- Data Pipeline Engineering: Build pipelines with dbt, Iceberg/Parquet, Redshift; surface data to Looker.
- CI/CD & Containerization: Drive CI/CD with GitLab; deploy services with Docker/Kubernetes.
- Mentorship & Leadership: Mentor peers and help lead the metrics infra roadmap.
Requirements
- 5+ years of software engineering with backend/data/infrastructure focus.
- Python expertise for scalable backends, SDKs, and CLI tools.
- Experience with AWS, Redshift, Iceberg, and Parquet.
- dbt for data transformations; Looker for BI.
- Ray for scaling Python workloads.
- Docker/Kubernetes; GitLab CI/CD; design docs & architecture reviews.
Benefits
- Medical, dental, vision coverage.
- 401(k) with company match; HSA.
- Life and pet insurance.
- Hybrid in-office options in Boston/Pittsburgh/Las Vegas or fully remote.
- Additional benefits under the Motional benefits program.