Related skills
Redshift, Snowflake, SQL, Python, Kafka

Description
- Design, build, and maintain robust data pipelines and ML systems for production environments.
- Develop and deploy ML and LLM-based solutions addressing real client business challenges.
- Build and maintain ETL/ELT workflows using modern orchestration and distributed computing tools.
- Implement MLOps practices: CI/CD, automated testing, model monitoring, and experiment tracking.
- Architect and implement cloud-native data and AI/ML solutions, primarily on AWS.
- Collaborate closely with Data Scientists, AI/ML Engineers, Backend Engineers, and client stakeholders.
Requirements
- 6+ years hands-on engineering experience with production systems.
- Full-stack mindset spanning AI, backend, data, and cloud infrastructure.
- Autonomous, proactive work style.
- Experience adopting AI tools in day-to-day workflows (e.g., Claude Code, Copilot).
- Strong ownership, proactive problem solving; open to learning adjacent areas.
- B2+ English; comfortable collaborating across distributed teams.
Benefits
- Impactful work: GenAI, MLOps, and NextGen data platforms for global enterprises.
- Senior-calibre peers across North America, LATAM, and EMEA.
- Career growth toward Tech Lead; engineers actively developed.
- Recognised AWS Premier Consulting Partner featured in Forrester's AI Technical Services Landscape.