Related skills
SQL, Python, pandas, dbt, Apache Airflow

Description
- Build scalable data infrastructure and ETL pipelines migrating from transactional databases to analytical data warehouses
- Create real-time data ingestion systems processing campaign data, user metrics, and business intelligence (BI) data
- Build multi-tenant data models with partitioning for enterprise-scale clients
- Develop data quality frameworks with validation, monitoring, and alerting
- Architect multi-tenant security with RLS/RBAC and audit trails
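The data-quality bullet above can be sketched as a minimal validation pass in pandas. The column names and rules here are illustrative assumptions, not the actual pipeline's schema:

```python
import pandas as pd

def validate_events(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality violations (empty list = clean).

    Schema and rules are hypothetical examples of the kind of checks
    a validation framework would run before loading to the warehouse.
    """
    issues = []
    # Required columns must be present before any row-level checks
    for col in ("tenant_id", "event_time", "metric_value"):
        if col not in df.columns:
            issues.append(f"missing column: {col}")
            return issues
    # A multi-tenant table must never carry rows without an owner
    if df["tenant_id"].isna().any():
        issues.append("null tenant_id rows found")
    # Metric values are assumed non-negative in this toy schema
    if (df["metric_value"] < 0).any():
        issues.append("negative metric_value rows found")
    return issues

clean = pd.DataFrame({
    "tenant_id": ["acme", "globex"],
    "event_time": pd.to_datetime(["2024-01-01", "2024-01-02"]),
    "metric_value": [1.5, 2.0],
})
print(validate_events(clean))  # []
```

In a production framework, each returned violation would feed the monitoring and alerting layer rather than a print statement.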
Requirements
- 5+ years of data engineering experience with production-scale systems
- Expert-level SQL skills with analytical databases (columnar preferred)
- Strong Python programming with data libraries: pandas, numpy, pyarrow
- Experience with ETL orchestration tools: Apache Airflow, Prefect, dbt
- Deep understanding of analytical databases, partitioning strategies, and OLAP optimization
- Knowledge of Row-Level Security (RLS) and RBAC for multi-tenant data
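A hedged sketch of the row-level-security idea named in the last bullet: every read is filtered by the caller's tenant, so one tenant can never see another's rows. The table and tenant names are hypothetical; real deployments would enforce this with database-native RLS policies (e.g. PostgreSQL `CREATE POLICY`), not application code:

```python
import pandas as pd

# Toy multi-tenant table; in production this lives in the warehouse
# and filtering is enforced by an RLS policy at the database layer.
events = pd.DataFrame({
    "tenant_id": ["acme", "acme", "globex"],
    "metric_value": [10, 20, 30],
})

def query_for_tenant(df: pd.DataFrame, tenant_id: str) -> pd.DataFrame:
    """Return only the rows the given tenant is allowed to see."""
    return df[df["tenant_id"] == tenant_id].reset_index(drop=True)

print(query_for_tenant(events, "acme")["metric_value"].tolist())  # [10, 20]
```

The point of database-level RLS is that this filter cannot be forgotten: it applies to every query a role issues, which is why it pairs naturally with the RBAC and audit-trail requirements above.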
Benefits
- Learning platform and career guidance from our training resources
- Wellness stipend and meal support for remote teams
- Generous vacation, floating holidays, and parental leave
- Medical, life, and business travel insurance coverage
- Stock options as part of our equity-sharing program