Related skills
redshift aws sql python airflow
📋 Description
- Design, build, and maintain large-scale ETL/ELT pipelines for market data
- Modernize legacy ETL frameworks into cloud-native AWS pipelines
- Build and manage data lakes and analytics-ready datasets using AWS-native services
- Clean, standardize, and govern financial instrument identifiers and mappings
- Design canonical financial data models (facts, dimensions, mappings)
- Implement data quality checks, lineage, observability, and validation frameworks (see the illustrative sketch after this list)
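As a rough illustration of the kind of batch pipeline this role covers, here is a minimal sketch assuming Airflow 2.4+ and boto3. The bucket name, prefix, DAG id, and task names are hypothetical placeholders, not details from the actual stack.

```python
# Minimal sketch of a daily market-data pipeline with a basic data quality gate.
# Assumes Airflow 2.4+ and boto3; bucket/prefix names are hypothetical.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

RAW_BUCKET = "market-data-raw"   # hypothetical S3 landing bucket
RAW_PREFIX = "eod-prices/"       # hypothetical partition prefix


def extract_market_data(**context):
    """List newly landed market-data files in S3 for downstream processing."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=RAW_BUCKET, Prefix=RAW_PREFIX)
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    context["ti"].xcom_push(key="raw_keys", value=keys)


def validate_market_data(**context):
    """Basic data quality check: fail the run if no files arrived."""
    keys = context["ti"].xcom_pull(task_ids="extract_market_data", key="raw_keys")
    if not keys:
        raise ValueError("Data quality check failed: no market-data files found")


with DAG(
    dag_id="market_data_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_market_data",
        python_callable=extract_market_data,
    )
    validate = PythonOperator(
        task_id="validate_market_data",
        python_callable=validate_market_data,
    )
    extract >> validate
```

In practice, the validation step would be replaced by a fuller data quality and lineage framework, and downstream tasks would load the validated files into analytics-ready tables via Glue, Athena, or Redshift.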
🎯 Requirements
- Bachelor’s Degree in CS, Engineering, Math, or equivalent
- 4–7 years in data engineering, fintech/trading, or market data
- Strong Python and SQL skills
- Experience with batch and/or streaming ETL pipelines
- Experience with AWS cloud services (S3, Glue, Lambda, Step Functions, Athena/Redshift, CloudWatch, IAM)
- Familiarity with Airflow or similar
🎁 Benefits
- Competitive compensation packages and comprehensive benefits
- Flexible working arrangements
- Collaborative, community-first culture
- Strong focus on health, wellness, and work-life balance
- Opportunities to make a real impact on a fast-scaling fintech platform