Related skills
Azure, Terraform, AWS, ETL, GCP
Description
- Manage the discovery-to-scale data pipeline and deliver validated solutions.
- Bridge the gap between building technical products and delivering solutions rapidly.
- Design data integration, storage, and infrastructure for analytics and AI.
Requirements
- Bachelor's degree in CS or Engineering; 8–12 years leading data projects.
- Enterprise data integration & architecture: design and maintain robust ETL/ELT pipelines.
- Cloud-native data ecosystems: AWS, Azure, or GCP; IaC and CI/CD for data solutions.
- Advanced data modeling: conceptual, logical, physical modeling; manage migrations.
- AI/MLOps readiness: apply AI tools to optimize data workflows and govern models.
- System reliability & resilience (SRE): define/monitor SLOs/SLIs; ensure resilient data systems.
Benefits
- Challenging and rewarding work with real impact
- Direct access to cutting-edge AI platforms
- Diverse and inclusive culture
- Growth opportunities for personal and professional development
- Hybrid working model
- Exposure to exciting projects and high-profile clients