Related skills
AWS · Snowflake · DBT · Airflow · Data pipelines
📋 Description
- Lead and grow a high-performing data engineering team
- Assess current data platform and lead re-architecture where needed
- Own platform reliability, observability, and data quality standards
- Design scalable ingestion and schema normalization to support rapid acquisitions
- Lead migration of acquired data pipelines onto Akido’s infrastructure (Snowflake, Airflow, DBT, AWS)
- Promote AI-assisted development as a team-wide engineering standard
🎯 Requirements
- 8+ years of software or data engineering experience
- 3+ years of engineering leadership experience
- Deep experience with Snowflake, Airflow, DBT, AWS
- Experience designing scalable ingestion architectures and schema normalization strategies
- Experience migrating legacy or acquired data pipelines to modern infrastructure
- Strong architectural judgment and production ownership experience
🎁 Benefits
- Stock options package
- Health benefits including medical, dental, and vision
- 401(k)
- Long-term disability
- Unlimited PTO
- Life insurance