Added: 6 days ago
Type: Full time
Salary: Not provided

Related skills

Azure, Terraform, AWS, ETL, GCP

πŸ“‹ Description

  • Manage the discovery-to-scale data pipeline and deliver validated solutions.
  • Bridge the gap between building technical products and delivering solutions rapidly.
  • Design data integration, storage, and infrastructure for analytics and AI.

🎯 Requirements

  • Bachelor's degree in CS or Engineering; 8–12 years leading data projects.
  • Enterprise data integration & architecture: design and maintain robust ETL/ELT pipelines.
  • Cloud-native data ecosystems: AWS, Azure, or GCP; IaC and CI/CD for data solutions.
  • Advanced data modeling: conceptual, logical, physical modeling; manage migrations.
  • AI/MLOps readiness: apply AI tools to optimize data workflows and govern models.
  • System reliability & resilience (SRE): define/monitor SLOs/SLIs; ensure resilient data systems.

🎁 Benefits

  • Challenging and rewarding work with real impact
  • Direct access to cutting-edge AI platforms
  • Diverse and inclusive culture
  • Growth opportunities for personal and professional development
  • Hybrid working model
  • Exposure to exciting projects and high-profile clients