Added: 22 hours ago
Type: Full time
Related skills: SQL, Python, Databricks, Scala, Airflow

Description
- Build and maintain API adapters for Gemini trade data
- Design and optimize data pipelines for ledger operations
- In-person twice weekly at SF or NYC offices
- Automate ETL processes
- Optimize SQL queries and database performance
- Collaborate with vendors, Data Analytics, and Cryptocore engineers
Requirements
- 5+ years building reconciliation systems and data pipelines
- Experience building real-time data solutions; data lake and data warehousing know-how
- Advanced SQL skills
- Databricks, Spark/PySpark, Kafka, Airflow experience
- Python and one of Scala, Rust, or Go
- Experience with MPP databases such as Redshift, Timescale, or StarRocks
Benefits
- Competitive starting pay
- A discretionary annual bonus
- Long-term equity grant
- Comprehensive health plans
- 401(k) with company matching
- Paid Parental Leave