Type: Full time
Related skills: GitHub, AWS, Snowflake, SQL, S3

Description
- Develop and scale standardized Python scripts to identify data mismatches and logic gaps across the application intake and decisioning pipeline.
- Leverage cloud-based data lakes (S3, Snowflake) as the primary data sources, performing complex data reconciliations to ensure accurate financial and regulatory reporting.
- Drive best practices in version control using GitHub, ensuring all control scripts are peer-reviewed, modular, and production-ready.
- Build Gherkin-based automated tests to ensure that technical implementations align with business intent and regulatory requirements.
- Integrate control outputs with enterprise monitoring tools (New Relic or Splunk) for real-time visibility and automate incident triggers via standard alerting platforms.
- Work closely with Data Analysts and Product Managers to translate complex business requirements into robust, reusable code templates that satisfy audit and risk standards.
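The reconciliation work described above can be sketched as a small Python routine. This is a minimal illustration, not code from the role: the `application_id` key and the two row lists (standing in for result sets pulled from, say, S3 and Snowflake) are hypothetical names chosen for the example.

```python
def reconcile(source_rows, target_rows, key="application_id"):
    """Compare two result sets keyed on a shared identifier.

    Returns keys missing from the target, keys missing from the
    source, and field-level mismatches for keys present in both.
    """
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}

    # Records present on one side only.
    missing_in_target = sorted(src.keys() - tgt.keys())
    missing_in_source = sorted(tgt.keys() - src.keys())

    # For shared keys, report any field whose values disagree.
    mismatches = []
    for k in sorted(src.keys() & tgt.keys()):
        diffs = {
            f: (src[k][f], tgt[k][f])
            for f in src[k].keys() & tgt[k].keys()
            if src[k][f] != tgt[k][f]
        }
        if diffs:
            mismatches.append((k, diffs))

    return missing_in_target, missing_in_source, mismatches
```

In practice the two row lists would come from SQL queries against each system, and the output would feed the alerting integrations mentioned above rather than being inspected by hand.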
Requirements
- Bachelor's Degree in Computer Science, Engineering, or a related technical field.
- 5+ years of software engineering experience with strong Python skills.
- 3+ years SQL with relational or NoSQL databases; 2+ years with AWS.
- 2+ years of Git and version control workflows (PR reviews, branch protection).
- Knowledge of Gherkin-based automated testing and CI/CD (Jenkins, GitHub Actions).
- Experience with Apache Kafka for real-time data pipelines (preferred).
Benefits
- Competitive compensation package
- Professional development opportunities
- Flexible work arrangements