Related skills
ETL, SQL, Python, data modeling, data lakes

📋 Description
- Design, implement, and maintain scalable data transformation layers and orchestration frameworks.
- Design and build robust pipelines to ingest data from APIs, logs, and relational DBs.
- Ensure reliable, timely execution of ETL/ELT pipelines to maintain data integrity.
- Standardize analytics workflows with version control, CI/CD, and automated data validation.
- Develop a semantic layer to enable self-service analytics for stakeholders.
- Monitor cloud compute usage and data model performance for low-latency reporting.
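The "automated data validation" responsibility above can be pictured with a minimal sketch. This is illustrative only, not part of the posting: the function name `validate_batch` and the rule set (required fields, non-negative amounts) are hypothetical examples of the kind of check a pipeline might run after each load.

```python
# Hypothetical sketch of an automated data-validation step for a pipeline.
# The function and field names are assumptions for illustration.

def validate_batch(rows, required_fields, non_negative_fields=()):
    """Return a list of human-readable errors found in a batch of records."""
    errors = []
    for i, row in enumerate(rows):
        # Required fields must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing required field '{field}'")
        # Numeric sanity check: amounts should not be negative.
        for field in non_negative_fields:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                errors.append(f"row {i}: '{field}' must be non-negative, got {value}")
    return errors

batch = [
    {"order_id": "A1", "amount": 19.99},
    {"order_id": "", "amount": -5.00},
]
issues = validate_batch(batch, required_fields=["order_id"],
                        non_negative_fields=["amount"])
for issue in issues:
    print(issue)
```

In practice a check like this would run as a pipeline task (e.g. in an orchestrator) and fail the run or raise an alert when `issues` is non-empty.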
🎯 Requirements
- 3+ years of experience in data engineering or analytics engineering
- Bachelor's degree in a quantitative or technical field, or 5+ years of experience as a Data Engineer
- Expert-level SQL, including writing and tuning high-volume queries
- Strong command of at least one major data processing language
- Hands-on experience with data lakes or cloud-based data warehouses
- Experience with ETL/ELT patterns and data integration
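The ETL/ELT distinction mentioned in the requirements can be sketched in a few lines. This is an illustrative toy, not part of the posting: the list `raw_orders` stands in for a raw landing table in a warehouse, and `load_raw`/`transform` are hypothetical names.

```python
# Hedged sketch of the ELT pattern: land source records unchanged first,
# then transform them downstream. All names here are illustrative.

raw_orders = []  # stand-in for a raw landing table in a warehouse


def load_raw(records):
    """E + L: append source records as-is, preserving the original payload."""
    raw_orders.extend(records)


def transform():
    """T: derive a clean, typed view from the raw layer after loading."""
    return [
        {"order_id": r["order_id"].strip().upper(), "amount": float(r["amount"])}
        for r in raw_orders
        if r.get("order_id", "").strip()  # drop records with no usable ID
    ]


load_raw([
    {"order_id": " a1 ", "amount": "19.99"},
    {"order_id": "", "amount": "0"},
])
clean = transform()
print(clean)
```

The design point is ordering: in ELT the raw data lands intact (so transforms can be re-run or revised later), whereas classic ETL reshapes records before they ever reach the warehouse.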