Related skills
elixir, sql, python, databricks, kafka
Description
- Design, build, and maintain event streaming pipelines ingesting data from client systems and services.
- Develop and operate analytical databases and data models for high-volume event data.
- Write production Elixir and Python services for event processing, transformation, and routing.
- Integrate legacy pipelines with modern streaming infrastructure, designing migration paths that minimize risk.
- Build monitoring, alerting, and observability tooling for data pipelines.
- Define and enforce event schemas, data contracts, and quality standards with producing/consuming teams.
Requirements
- 6+ years of professional experience in data or backend engineering with a focus on event-driven systems.
- Proficiency in Elixir and/or Python for building application connectors and data pipelines.
- Advanced SQL skills for data modeling, query optimization, and analytics.
- Hands-on experience with columnar/OLAP databases at production scale.
- Experience with stream processing frameworks and brokers such as Flink, Kafka, Pulsar, or Kinesis.
- Experience designing data systems on AWS; GCP experience is a plus.
Benefits
- Ground-floor opportunity to shape the architecture of a critical data domain.
- Commitment to diversity and an inclusive, equal-opportunity culture.