Related skills
BigQuery, Redshift, Snowflake, SQL, Python

Description
- Design, build, and maintain Rain's core data pipelines from multiple sources
- Own orchestration and workflow management with Airflow, Dagster, or similar
- Architect and manage Rain's data warehouse (Snowflake, BigQuery, Redshift)
- Develop ELT/ETL for logs, transactions, ledgers, and on-chain data
- Implement data quality frameworks and observability (tests, contracts, lineage)
- Partner with backend engineers to instrument events and telemetry
Requirements
- Data infrastructure builder: own pipelines end-to-end in early-stage environments
- Expert data engineer: strong Python/SQL; production-grade ETL/ELT
- Workflow & orchestration fluent: Airflow, Dagster, Prefect
- Warehouse & modeling savvy: design schemas; Snowflake/BigQuery/Redshift
- Quality-obsessed: data integrity, testing, lineage, observability
- Systems thinker: data as a platform; reliability and scale
Benefits
- Unlimited time off (minimum of 10 days)
- Flexible working: remote or office; home environment stipend
- Health, dental, vision; life insurance
- 401(k) with 4% company match
- Equity option plan
- Team summits domestically and internationally