Added: 4 hours ago
Type: Full time
Salary: Not provided

Related skills

Snowflake, SQL, Python, Kubernetes, Airflow

πŸ“‹ Description

  • Design, implement, and maintain data pipelines for on-chain data used by customers
  • Establish architecture for data quality and timely delivery, with validation gates, SLAs, and observability
  • Triage, analyze, and fix pipeline issues; drive root-cause analysis and reliability
  • Own end-to-end transformation pipelines with testing, CI, docs, and cost awareness
  • Build and orchestrate reliable pipelines using modern schedulers and control planes
  • Partner with product, GTM, and community teams to unlock value and deliver faster insights
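The "validation gates" responsibility above can be pictured with a minimal, standard-library sketch; the names (`Batch`, `validate_batch`, `transform`) are hypothetical illustrations, not part of the role's actual codebase:

```python
# Illustrative only: a pipeline stage guarded by a validation gate.
# All names here are hypothetical, not from the job posting.
from dataclasses import dataclass


@dataclass
class Batch:
    rows: list
    source: str


def validate_batch(batch: Batch, min_rows: int = 1) -> Batch:
    """Validation gate: refuse to pass an undersized batch downstream."""
    if len(batch.rows) < min_rows:
        raise ValueError(
            f"validation gate failed for {batch.source}: "
            f"{len(batch.rows)} rows < {min_rows}"
        )
    return batch


def transform(batch: Batch) -> Batch:
    # Example transformation: normalize hex addresses to lowercase.
    return Batch(rows=[r.lower() for r in batch.rows], source=batch.source)


raw = Batch(rows=["0xABC", "0xDEF"], source="onchain_events")
clean = transform(validate_batch(raw))
print(clean.rows)  # ['0xabc', '0xdef']
```

In a real deployment the gate would typically live in an orchestrator task (e.g. Airflow or Prefect, both named in the requirements) so a failed check halts the run and pages on-call rather than silently shipping bad data.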

🎯 Requirements

  • Strong SQL skills and data modeling in modern warehouses; expertise in Trino, Snowflake, or ClickHouse
  • Proficiency in Python for pipeline development, tooling, and automation
  • Proven track record operating robust pipelines and orchestration with Prefect, Airflow, Elementary, or similar
  • Solid computer science fundamentals, system design, and experience in cloud and Kubernetes
  • Ability to analyze, debug, and resolve data pipeline issues independently in a remote async environment
  • Strong blockchain data intuition, including reading transactions, events, and traces
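As a rough illustration of the last requirement (reading transactions, events, and traces), here is a hedged, standard-library-only sketch of decoding one EVM event log entry; the log's field shape mirrors common JSON-RPC output, but the values and helper names are made up:

```python
# Hypothetical example: decoding a raw EVM event log with the stdlib only.
# The dict shape mirrors typical JSON-RPC log output; values are fabricated.

log = {
    "topics": [
        "0x" + "00" * 32,              # event signature hash (placeholder)
        "0x" + "00" * 12 + "ab" * 20,  # indexed address, left-padded to 32 bytes
    ],
    "data": "0x" + hex(10**18)[2:].rjust(64, "0"),  # one uint256 word
}


def topic_to_address(topic: str) -> str:
    """Indexed address topics are 32 bytes; the address is the last 20."""
    return "0x" + topic[-40:]


def data_to_uint(data: str) -> int:
    """Decode a single 32-byte big-endian word as an unsigned integer."""
    return int(data, 16)


sender = topic_to_address(log["topics"][1])
amount = data_to_uint(log["data"])
# sender recovers the padded address; amount recovers 10**18.
```

Interview-level fluency here usually means knowing that indexed event parameters land in `topics` while non-indexed ones are ABI-packed into `data`, which is what the two helpers above exploit.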

🎁 Benefits

  • Competitive salary and equity package; top 25% in the space
  • Equity with a discounted strike price (~90%) and a 10-year exercise window
  • 5 weeks PTO plus local public holidays
  • Fully remote-first with flexible hours
  • Healthy async and sync work balance
  • Private medical, dental, and vision insurance