Related skills
bigquery, sql, python, gcp, hadoop

📋 Description
- Design scalable data architectures on GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data modeling: conceptual, logical, physical models
- Big data processing with Spark and Hadoop
- Data governance: quality, security, compliance, metadata
- Build and optimize data pipelines (ingest, transform, load)
- Monitor and tune data systems for performance
- Collaborate with data engineers, scientists, and stakeholders
- Stay current with data architecture and cloud computing trends
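The pipeline responsibilities above (ingest, transform, load) can be sketched as a minimal Python flow; the record schema, field names, and cleaning rules here are hypothetical stand-ins for a real source feed and warehouse:

```python
# Minimal ingest-transform-load sketch (hypothetical data and schema).

def ingest(rows):
    """Ingest raw records (a list of dicts standing in for a source feed)."""
    return list(rows)

def transform(records):
    """Normalize field values and drop incomplete rows."""
    cleaned = []
    for r in records:
        if r.get("user_id") and r.get("amount") is not None:
            cleaned.append({"user_id": r["user_id"].strip(),
                            "amount_usd": round(float(r["amount"]), 2)})
    return cleaned

def load(records, sink):
    """Append transformed records to a sink (an in-memory list here)."""
    sink.extend(records)
    return len(records)

raw = [{"user_id": " u1 ", "amount": "10.5"},
       {"user_id": None, "amount": "3"},
       {"user_id": "u2", "amount": "7.25"}]
warehouse = []
loaded = load(transform(ingest(raw)), warehouse)
print(loaded)  # 2 valid rows loaded
```

In a GCP deployment, the ingest step would typically read from Pub/Sub or Cloud Storage and the load step would write to BigQuery, with Dataflow handling the transform at scale.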
🎯 Requirements
- GCP data services: BigQuery, Dataflow, Pub/Sub, Cloud Storage
- Data modeling expertise
- Spark and Hadoop experience
- Cloud architecture design and cost optimization
- Data governance: quality, security, compliance
- SQL, Python, dbt proficiency
- Problem-solving and communication skills
- Bachelor's degree in CS/CE/Data or equivalent
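As an illustration of the data-quality side of governance listed above, a minimal batch validation might look like the following sketch; the rule names, field names, and sample batch are hypothetical:

```python
# Hypothetical data-quality checks over a batch of records.

def check_quality(records, required_fields, unique_key):
    """Return a dict mapping each rule name to the indices of failing rows."""
    failures = {"missing_field": [], "duplicate_key": []}
    seen = set()
    for i, r in enumerate(records):
        # Rule 1: every required field must be present and non-empty.
        if any(r.get(f) in (None, "") for f in required_fields):
            failures["missing_field"].append(i)
        # Rule 2: the key column must be unique across the batch.
        key = r.get(unique_key)
        if key in seen:
            failures["duplicate_key"].append(i)
        seen.add(key)
    return failures

batch = [{"id": "a", "email": "a@x.com"},
         {"id": "b", "email": ""},
         {"id": "a", "email": "c@x.com"}]
report = check_quality(batch, required_fields=["id", "email"], unique_key="id")
print(report)  # {'missing_field': [1], 'duplicate_key': [2]}
```

In practice such checks would run as part of the pipeline (e.g. dbt tests or assertions in a Dataflow job) and feed a metadata or monitoring system rather than return an in-memory report.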