Related skills
bigquery sql python dbt kafka

Description
- Design, architect, and maintain core marketplace datasets, data marts, and feature stores.
- Collaborate with analytics, data science, and ML teams to integrate data into the SDLC.
- Collaborate with product engineers, analysts, data scientists, and ML engineers to design datasets.
- Drive data quality and best practices across business areas.
- Build next-gen data products on real-time data atop Apache Kafka.
Requirements
- 4 or more years of experience designing and building data sets and warehouses.
- Strong ability to partner with analytics and other teams for process improvements.
- Expertise in SQL for analytics, and experience building ETL transforms in SQL and Python.
- Experience building data warehouses and data marts from production and clickstream data.
- Experience with cloud-native stacks (BigQuery, dbt, Apache Airflow) or similar.
- Experience with, or interest in, AI-enabled workflows that increase development velocity.
- Strong ownership from ideation to delivery.