Type: Full time
Salary: Not provided

Related skills

BigQuery, Python, Hadoop, Spark, Google Cloud Platform

πŸ“‹ Description

  • Data Engineer role focusing on GCP and Python.
  • Work with Big Data technologies such as Spark and Hadoop.
  • Design scalable cloud infrastructure on Google Cloud: Storage, BigQuery, Dataflow, Cloud Run.
  • Build robust data pipelines for banking/financial clients (see the Python sketch after this list).
  • Collaborate on enterprise data transformation projects at Capco.
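
To make the day-to-day work concrete, here is a minimal sketch of the kind of GCP pipeline step described above: loading a file from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, not details from the posting.

```python
# Minimal illustration: load a CSV from Cloud Storage into BigQuery.
# All identifiers below (project, dataset, table, bucket) are invented.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.finance.daily_transactions"      # hypothetical table
source_uri = "gs://my-bucket/exports/transactions.csv"  # hypothetical bucket

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```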

🎯 Requirements

  • 6-10 years of data engineering experience.
  • Proficient in Spark and Hadoop for large-scale data processing (a PySpark sketch follows this list).
  • Experience with Google Cloud Platform (Storage, BigQuery, Dataflow, Cloud Run).
  • Strong Python scripting and automation skills.
  • Experience delivering data solutions for banks/insurance clients.
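
As a rough illustration of the Spark proficiency called for above, the sketch below aggregates transactions per account per day with PySpark. The input path and column names (event_ts, account_id, amount) are assumptions made for the example, not part of the posting.

```python
# Hedged PySpark sketch: daily per-account transaction totals.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-transaction-totals").getOrCreate()

# Assumed input: one Parquet row per transaction.
txns = spark.read.parquet("gs://my-bucket/transactions/")

daily_totals = (
    txns.groupBy("account_id", F.to_date("event_ts").alias("day"))
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"),
        )
)

# Write back to Cloud Storage for downstream BigQuery loads.
daily_totals.write.mode("overwrite").parquet("gs://my-bucket/daily_totals/")
```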

🎁 Benefits

  • A tolerant, open culture that values diversity and inclusion.
  • No forced hierarchy; room to grow your career with Capco.