Related skills
Docker, Terraform, GitHub Actions, Python, CI/CD
Description
- Design, build, and maintain scalable data pipelines and SDKs.
- Develop data governance tooling and observability frameworks.
- Create reusable libraries and pipeline templates for business teams.
- Automate data workflows and empower business teams.
- Contribute to enterprise-wide data governance initiatives.
Requirements
- 3–5 years of experience in Data Engineering, DevOps, or Software Engineering.
- Hands-on experience with Apache Spark and/or Kafka.
- Strong Python skills for building data pipelines, SDKs, and automation.
- DevOps experience with CI/CD and Docker.
- Experience with IaC tools, especially Terraform.
- English proficiency; Airflow, GitHub Actions, and AWS/GCP are nice to have.