Related skills: Azure, Java, AWS, SQL, Python

Description
- Design and implement scalable data pipelines ingesting data from TMS, WMS, and ERP systems
- Optimize performance and scalability for large logistics data volumes
- Build data models, data marts, and data warehouses/lakes
- Translate business requirements into technical data solutions
- Implement data quality, lineage, and observability practices
- Take ownership of production support
Requirements
- 5+ years of experience as a Data Engineer or similar backend role
- Proven experience building production-grade ETL/ELT pipelines
- Strong SQL and T-SQL skills, combined with Python or Java
- Cloud platforms: Azure, AWS, or GCP
- Strong data modeling, architecture, and governance knowledge
- Excellent problem-solving skills and the ability to work independently in a fast-paced environment
Benefits
- Flexible hours and a remote-first setup
- Competitive compensation
- Complete hardware/software setup for work
- Open-door culture with transparent communication
- Health insurance, vacation, sick leave, holidays, paid parental leave
- Access to learning and development center with workshops and training