Related skills: AWS, Snowflake, Python, Databricks, GraphQL

Description
- Contribute scalable data solutions as part of a high-performing data engineering team.
- Partner with the CDO to implement an AWS data platform enabling self-service AI/analytics.
- Implement ingestion, transformation, and distribution frameworks for faster insights.
- Champion AI adoption, integrating AI capabilities into data workflows.
- Ensure reliability with robust quality controls and incident prevention.
Requirements
- 3-5 years hands-on data-centric software engineering experience in financial services.
- Strong AWS expertise (S3, EKS, Lambda, Glue, Redshift, etc.) and data platforms (Databricks, Snowflake).
- Experience with data catalogs and MDM (Collibra, Informatica) and data governance.
- Python for data engineering (Pandas, Spark) and API development (FastAPI).
- REST/GraphQL API design, low-latency reads; ETL/ELT, Kafka/MSK.
- CI/CD and IaC with GitHub workflows and ArgoCD; security and reliability (IAM, KMS, VPC, observability).
- Strong collaboration with stakeholders to solve complex data needs.
Benefits
- Work-Life Balance: 40% in-office monthly; flexible.
- Culture of Learning & Mobility: trainings, leadership development, mentorship.
- Retirement planning and tuition reimbursement programs.
- Comprehensive healthcare and well-being offerings.
- Family-friendly policies, including parental leave.
- Inclusive environment with Employee Resource Groups and global collaboration.
- Paid volunteer days and matched donations.