Related skills
BigQuery, Python, Hadoop, Spark, Google Cloud Platform

Description
- Data Engineer focusing on GCP and Python.
- Work with big data technologies such as Spark and Hadoop.
- Design scalable cloud infrastructure on Google Cloud: Cloud Storage, BigQuery, Dataflow, and Cloud Run.
- Build robust data pipelines for banking and financial clients (a representative ingestion step is sketched below).
- Collaborate on enterprise data transformation projects at Capco.
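For illustration, here is a minimal sketch of the kind of Cloud Storage-to-BigQuery ingestion step such pipelines commonly include, using the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are placeholders, not details from this posting:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Placeholder project name -- illustrative only.
client = bigquery.Client(project="my-gcp-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the CSV header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load a daily extract from Cloud Storage into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/transactions/2024-01-01.csv",
    "my-gcp-project.finance.transactions",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(f"Loaded {load_job.output_rows} rows into finance.transactions")
```

In production, a step like this would typically run inside an orchestrated pipeline (for example, Dataflow or Cloud Composer) rather than as a standalone script.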
Requirements
- 6-10 years of data engineering experience.
- Proficiency with Spark and Hadoop for big data processing.
- Experience with Google Cloud Platform (Cloud Storage, BigQuery, Dataflow, Cloud Run).
- Strong Python scripting and automation skills.
- Experience delivering data solutions for banking and insurance clients.
Benefits
- Tolerant, open culture valuing diversity.
- No forced hierarchy; grow your career at Capco.
- Commitment to diversity and inclusion.