Related skills
AWS, Snowflake, SQL, Python, Kafka
Description
- Design, build, and maintain scalable data pipelines with AWS, Airflow, and dbt Cloud
- Implement Medallion Architecture (Bronze/Silver/Gold) for data layers
- Ingest data with AWS: S3, Lambda, Glue, SNS/SQS, Kinesis
- Develop ELT pipelines using dbt Cloud and orchestrate them with Airflow (see the orchestration sketch after this list)
- Ensure data quality, lineage, observability, and governance (PII handling, GDPR)
- Partner with Product, Analytics, Risk, and Compliance teams to deliver trusted data
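As an illustration of the dbt Cloud + Airflow orchestration described above, here is a minimal sketch. It assumes the apache-airflow-providers-dbt-cloud package and an Airflow connection named dbt_cloud_default; the DAG name, job ID, and schedule are hypothetical and not details from this posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

# Illustrative values only: the connection name, job ID, and schedule are placeholders.
with DAG(
    dag_id="silver_layer_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # Trigger a dbt Cloud job that builds Silver models from Bronze sources
    # and wait for it to finish before the task succeeds.
    run_silver_models = DbtCloudRunJobOperator(
        task_id="run_dbt_silver_models",
        dbt_cloud_conn_id="dbt_cloud_default",
        job_id=12345,        # hypothetical dbt Cloud job ID
        check_interval=60,   # poll run status every 60 seconds
        timeout=3600,        # fail if the run has not finished within 1 hour
    )

In a Medallion-style setup, comparable DAGs would typically cover Bronze ingestion and Gold marts as well, so scheduling, retries, and lineage stay visible in the orchestrator.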
Requirements
- 5+ years of hands-on data engineering experience in the banking/financial services domain
- Snowflake, dbt Cloud, Apache Airflow, and AWS data services
- Strong SQL and Python; familiarity with Medallion/Lakehouse architecture
- Experience with batch and near-real-time ingestion
- Git, CI/CD, and Infrastructure as Code exposure
- Banking domain experience: transactions, payments, accounts, regulatory reporting
Benefits
- Competitive salary
- Annual bonus opportunity
- Great career growth and development opportunities
- Flexible approach to work