Related skills: Docker, ETL, SQL, Python, Kubernetes

📋 Description
- Design, develop, and deploy efficient data pipelines and ETL processes.
- Ensure data integrity with validation, schema checks, and error handling.
- Optimize data processing performance and resource usage for speed and cost-effectiveness.
- Write clean, maintainable code for data transformations between systems.
- Optimize pipelines for performance, security, and maintainability.
- Collaborate in code reviews and architectural decisions.
🎯 Requirements
- 3+ years in cloud-based data solutions development.
- Proficiency in Python, JavaScript/TypeScript, Go, or other modern languages.
- Advanced SQL, data modeling, and data transformation skills.
- Familiarity with OAuth, API keys, JWT, and API security best practices.
- Experience with Docker, Kubernetes, and CI/CD pipelines.
- Experience with cloud platforms (AWS, GCP, or Azure) and data warehousing.
🎁 Benefits
- Career Growth Opportunities
- Engaging Work Culture
- Top-Tier Compensation
- Equity Package
- Healthcare Coverage
- PTO