Related skills
sql, python, pandas, dbt, airflow
Description
- Build scalable data infrastructure for multi-tenant analytics
- Design ETL pipelines migrating data from OLTP to warehouses
- Create real-time data ingestion for campaigns and metrics
- Build multi-tenant data models with partitioning for enterprise-scale clients
- Develop data quality frameworks with validation and monitoring
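As a flavor of the last responsibility, here is a minimal sketch of a data-quality check in pandas; the table shape (`tenant_id`, `event_id`, `ts`) and the specific checks are illustrative assumptions, not this team's actual framework.

```python
import pandas as pd

def validate_events(df: pd.DataFrame) -> dict:
    """Minimal data-quality report: required columns, null rows, duplicate keys.
    Column names here are hypothetical examples."""
    required = {"tenant_id", "event_id", "ts"}
    present = list(required & set(df.columns))
    report = {
        "missing_columns": sorted(required - set(df.columns)),
        # Rows where any required column is null.
        "null_rows": int(df[present].isna().any(axis=1).sum()),
        # Primary-key uniqueness check.
        "duplicate_event_ids": int(df["event_id"].duplicated().sum())
        if "event_id" in df.columns else 0,
    }
    report["passed"] = (
        not report["missing_columns"]
        and report["null_rows"] == 0
        and report["duplicate_event_ids"] == 0
    )
    return report
```

In a production pipeline, a report like this would typically gate a load step or feed a monitoring dashboard rather than be inspected by hand.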
Requirements
- 5+ years of data engineering experience with production-scale systems
- Expert-level SQL with analytical/columnar databases
- Strong Python with pandas, numpy, pyarrow
- Experience with ETL orchestration tools such as Airflow, Prefect, and dbt
- Deep understanding of analytical databases, partitioning, OLAP
- Experience building SaaS data platforms with tenant isolation
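To give a concrete sense of the tenant-isolation requirement, the sketch below generates PostgreSQL-style LIST-partition DDL, one partition per tenant; this is one common pattern, and the table and column names are assumptions for illustration only.

```python
def tenant_partition_ddl(table: str, tenant_ids: list[int]) -> list[str]:
    """Emit DDL that LIST-partitions a table by tenant_id, so each tenant's
    rows live in a separate partition. Schema is a hypothetical example."""
    ddl = [
        f"CREATE TABLE {table} ("
        "tenant_id BIGINT NOT NULL, event_id BIGINT, payload JSONB"
        f") PARTITION BY LIST (tenant_id);"
    ]
    for tid in tenant_ids:
        # One child partition per tenant keeps scans and maintenance scoped
        # to a single client's data.
        ddl.append(
            f"CREATE TABLE {table}_t{tid} PARTITION OF {table} "
            f"FOR VALUES IN ({tid});"
        )
    return ddl
```

Partition-per-tenant trades catalog overhead for strong isolation; at higher tenant counts, teams often switch to hash partitioning plus row-level security instead.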
Benefits
- People: talented, collaborative, friendly teammates
- Guidance: learning platform to help you succeed from day one
- Surprise meal stipends for remote work
- Work-life harmony: 26 days vacation, holidays, wellness allowance
- Medical, life, and business travel insurance
- Stock options as part of our equity-sharing program