Related skills: AWS, ETL, SQL, NoSQL, S3
Description
- Develop, maintain, and enhance ETL pipelines.
- Write efficient, testable, scalable, and secure client code.
- Gather requirements, validate understanding, and document.
- Execute activities per current methodology with high quality.
- Collaborate with engineers, designers, and managers to solve user pain points.
- Take ownership from design to launch.
Requirements
- Minimum 5 years of experience in data engineering or a related field.
- Strong proficiency in Python programming language.
- Deep understanding of AWS services (Kinesis, S3, Athena, Redshift, DynamoDB, Lambda).
- Experience with data ingestion pipelines, ETL processes, and data warehousing concepts.
- Proficiency in SQL and NoSQL databases.
- Experience with data modeling and schema design.
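As a purely illustrative sketch of the extract-transform-load work this role centers on (not part of the posting): in production the extract step would read from Kinesis or S3 and the load step would target Redshift or DynamoDB, but the same shape can be shown with in-memory data and SQLite standing in for the warehouse. All names here are hypothetical.

```python
import sqlite3

def extract(rows):
    # Extract: in production this might read from S3 or Kinesis;
    # here we take an in-memory list of raw records and drop malformed ones.
    return [r for r in rows if r.get("event_id") is not None]

def transform(records):
    # Transform: normalize field names and values into warehouse-ready tuples.
    return [(r["event_id"], r.get("user", "").strip().lower()) for r in records]

def load(conn, rows):
    # Load: write into a warehouse table (SQLite stands in for Redshift).
    conn.execute("CREATE TABLE IF NOT EXISTS events (event_id INTEGER, user TEXT)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(raw, conn):
    load(conn, transform(extract(raw)))

raw = [
    {"event_id": 1, "user": " Alice "},
    {"event_id": None},              # malformed record, filtered at extract
    {"event_id": 2, "user": "BOB"},
]
conn = sqlite3.connect(":memory:")
run_pipeline(raw, conn)
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 2
```

Swapping the in-memory list for a boto3 S3 read and SQLite for a Redshift connection would turn this skeleton into the kind of pipeline the requirements describe.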
Benefits
- Flexible work environment with a remote-first approach.
- Global, English-speaking team spanning many cultures.
- Well-being focus with fitness offerings, mental health plans, or generous time off.
- Professional services model enabling career growth.
- Equal opportunity employer with diversity commitment.