Related skills
bigquery sql python gcp spark

Description
- Design, build, and maintain scalable data pipelines on GCP
- Develop and optimize batch and real-time data processing solutions
- Build and manage data lakes and data warehouses
- Develop and maintain datasets for analytics and reporting
- Improve data quality, reliability, and efficiency
- Create solution design and technical documentation
Requirements
- Proficient in Google Cloud Platform (GCP)
- BigQuery and Cloud Storage experience
- Dataflow/Dataproc and Pub/Sub expertise
- Python (Pandas, NumPy, PySpark) and SQL
- Spark (Batch & Streaming) and data modeling
- DevOps: CI/CD, Docker, Kubernetes, IaC