Related skills
bigquery aws sql nosql python
Description
- Build and maintain robust data pipelines; ensure data quality.
- Use Google Data Products (BigQuery, Dataflow, Composer) and AWS data services.
- Manage metadata, data quality, and data lineage across cloud platforms.
- Lead end-to-end data engineering lifecycle; handle non-functional requirements.
- Design end-to-end solutions with stakeholders; prototype and build data visualizations.
- Develop and optimize SQL/NoSQL storage; Python data manipulation; ETL.
Requirements
- Python: 3-4 years with semi/unstructured data and REST API integration.
- Relational Databases: 3-4 years with BigQuery, AWS Athena/Redshift, or MySQL.
- ETL Tools: 4-5 years with Airflow/Composer, AWS Glue or Informatica; data modeling.
- Google Cloud: 2 years with BigQuery, Dataflow, Cloud Composer.
- AWS: 2 years with S3, Glue, Athena, Redshift, API Gateway.
- Data engineering lifecycle: design, development, and operations; Agile/Jira.
Benefits
- Winning culture with growth and development opportunities.
- DEIB commitment and inclusive, respectful workplace.