Related skills
redshift, aws, sql, s3, python

Description
- Design, build, and operate robust, fault-tolerant data pipelines with Airflow.
- Architect and enhance data ingestion and AWS infra (Redshift, S3).
- Partner with Analytics, Product, Finance, Marketing to design analytics-ready datasets.
- Define and enforce SLAs, data quality, observability, and alerting.
- Contribute to design reviews and share engineering best practices.
- Evaluate data tooling to improve reliability, productivity, and analytics velocity.
Requirements
- 7+ years of experience in Data Engineering, including ownership of production data platforms.
- Deep expertise designing and operating Airflow and AWS data platforms (Redshift, S3, IAM) in production.
- Advanced proficiency in SQL and Python with strong data modeling and systems design.
- Strong understanding of distributed processing, ELT architectures, analytics workflows, BI use cases, and stakeholder collaboration.
- Knowledge of AI tools, including prompt engineering and modern AI stacks.
- Strong communication, ability to lead technical discussions, influence architecture, and drive initiatives across teams.
Benefits
- Equity and comprehensive benefits (medical, dental, vision, flexible PTO, and more).
- Offices in San Francisco (HQ), Waterloo, and a studio in Los Angeles.
- Equal opportunity employer; accommodations for qualified individuals with disabilities.