Related skills
ETL, SQL, NoSQL, Python, GCP

Description
- Design and implement ETL pipelines in GCP
- Build Python-based data pipelines for ML
- Prepare and transform data for Data Science teams
- Create data ingestion processes to generate model predictions
- Leverage cloud tools (GCP, Azure, Databricks) for data storage/pipelines
- Collaborate on version control and CI/CD workflows
Requirements
- 3+ years SQL (ETL, complex DB scripting)
- 3+ years Python (ETLs, ML pipelines)
- 2+ years GCP
- Tagging datasets with GCP data annotation tools
- Data preparation for Data Science teams: cleaning and integration
- Git, CI/CD, Spark/Hadoop, NoSQL (MongoDB, Cosmos DB)
Benefits
- Professional growth
- Dynamic work environment
- Attractive benefits plan
- Comprehensive benefits and leadership support