Related skills
snowflake, sql, python, dbt, kafka

Description
- Architect and implement a next-gen data warehouse for scale.
- Build real-time data ingestion pipelines from large production databases.
- Develop real-time data transformations for live analytics.
- Define and execute a data roadmap with the Senior Data Product Manager.
- Lead technical direction, evaluate solutions, and drive best practices.
- Validate data integrity and measure system performance.
- Collaborate with stakeholders to build technical business cases.
Requirements
- 5+ years data engineering in SaaS.
- Real-time messaging with Kafka, Kinesis, or Pub/Sub.
- dbt, Flink, or Spark Streaming for transforms.
- Snowflake, including Snowpipe and Tasks for CDC workloads.
- Postgres/MySQL CDC via Debezium.
- SQL and Python for data workflows.
- Data APIs to expose warehouse insights.
- Familiarity with Looker for BI.
Benefits
- Inclusive work environment
- Growth and development opportunities
- Global, customer-focused culture