Related skills
bigquery, python, kubernetes, gcp, kafka

Description
- Design, build, and deploy scalable data pipelines from APIs, webhooks, and SFTP sources.
- Model and transform data to unify disparate sources into high-fidelity products.
- Modernize legacy data processes into scalable, stable architectures.
- Improve observability, automation, and AI-assisted tooling for faster delivery.
- Collaborate with product, engineering, support, and business teams.
- Investigate complex issues and perform root-cause analysis.
Requirements
- Experience architecting and deploying complex data pipelines.
- Pride in owning software from design through production monitoring.
- Python proficiency for large-scale data processing.
- Experience with streaming technologies (Kafka, Pub/Sub) and cloud databases (BigQuery, Snowflake).
- Cloud platform experience (GCP, AWS, Azure).
- Fluent English with strong communication skills.
Benefits
- Flexible time off: Autonomy to manage your work-life balance.
- Alan Flex benefits: €160/month for food or nursery.
- Flexible compensation: Payroll deductions for food, transportation, and nursery.
- Wellbeing support: Subsidized ClassPass subscription.
- Comprehensive health insurance: Alan coverage for you and dependents.
- Impactful work: Shape products used by 85,000+ users worldwide.