Related skills
BigQuery, Java, Snowflake, Python, Kotlin

Description
- Design, build, and operate self-serve ETL/ELT for batch & streaming.
- Own pipeline reliability, data quality SLAs, and schema evolution.
- Drive data platform roadmap; define patterns and standards for eng teams.
- Partner with Product Eng, Data Science, Analytics, and Marketing on data contracts.
- Architect data lineage, schema registry, access controls for compliance.
- Own platform monitoring, alerting, debugging, and on-call incidents.
Requirements
- 5+ years building and operating production data infrastructure.
- Strong system design skills; experience writing design docs and reasoning about trade-offs.
- Python or Java/Kotlin; emphasis on testing and maintainability.
- Experience with a cloud data warehouse (Snowflake/BigQuery/Redshift), an orchestrator (Airflow/Dagster/Prefect), and Terraform.
- Streaming tech (Kafka/Flink/Kinesis) and dbt data modeling.
- Observability for data systems; governance in regulated environments.
Benefits
- In-office policy of four days per week, with remote Fridays for employees near an office.
- Backup child/elder/pet care plus subsidized commuter benefit.
- Competitive salary based on experience.
- 401k match plus medical, dental, vision, life, and disability benefits.
- Generous vacation policy and company-wide days off.
- Parental leave and Maven fertility/adoption support.