Related skills
bigquery, mongodb, sql, python, oracle

Description
- Design and build large-scale distributed data infrastructure for analytics.
- Architect and maintain real-time data pipelines with Kafka, Debezium, and connectors.
- Enable cloud data warehousing and database optimization.
- Implement CDC pipelines ensuring data integrity and fault tolerance.
- Collaborate with teams to translate business needs into data solutions.
- Govern data quality, lineage, and security across domains.
Requirements
- Expertise in Apache Kafka, Debezium, and Confluent connectors.
- Experience building reliable CDC-based streaming pipelines.
- Proficiency with ETL/ELT using Dataiku, Python, and SQL.
- Hands-on with Oracle, MongoDB, and BigQuery.
- Strong data modeling, SQL analytics, and performance tuning.
- Knowledge of data governance, metadata, and security compliance.
Benefits
- Inclusive, diverse culture with flexible remote, in-office, and hybrid options.
- Competitive compensation and potential equity.
- Annual learning stipend and ongoing training.
- Join a team of 30+ nationalities across 7 countries.
- Autonomy, mentorship, and challenging goals.