Related skills
redshift, aws, sql, s3, python

Description
- Design and build scalable data pipelines and infrastructure
- Collaborate with analytics, product, and engineering teams
- Develop and optimize ETL/ELT workflows in SQL and Python
- Schedule, monitor, and troubleshoot Airflow workflows
- Work with Spark, Trino, and AWS data services
- Ensure data quality and governance across datasets
Requirements
- 1–3 years of data engineering/analytics experience
- Experience building data pipelines in cloud environments
- Proficiency in SQL
- Proficiency in Python for data processing
- Familiarity with Apache Airflow
- Exposure to AWS services (S3, Redshift, Glue, Athena)
Benefits
- Generous health coverage for you and your family, including disability and life insurance
- 401(k) match after 60 days, 100% vested after 1 year
- HSA contributions and an HRA to offset your deductible
- Flexible vacation policy
- Be More – rewards and recognition program
- Volunteer time off and mental health time