Related skills: AWS, Snowflake, SQL, Python, REST APIs
📋 Description
- Build and manage integrations across systems, customers, SaaS, and data stores via APIs.
- Analyze customer workflows to ingest, unify, and onboard data to TDX.
- Implement TDX APIs with customer and partner solutions.
- Support taxonomy, data processes, models, and mappings for data quality.
- Optimize data pipelines for performance, scalability, resiliency, and quality.
- Ensure data integrity, security, and governance with validations and controls.
- Collaborate with Product, Data Science, and Analytics teams to translate requirements into solutions.
- Contribute to continuous improvement of Brightfield’s data tooling and practices.
🎯 Requirements
- 5+ years in data engineering, data integration, or backend with production pipelines.
- Bachelor’s degree in CS/IS or related field, or equivalent.
- Expert-level Python and SQL, working with large, complex datasets.
- Hands-on experience with REST APIs and JSON data formats.
- Strong experience with AWS and cloud data warehouses such as Snowflake.
- Data modeling, ETL/ELT, and data quality practices.
- Ability to troubleshoot data and integration issues across distributed systems.
- Comfortable working in a remote-first, distributed team.
🎁 Benefits
- Fully remote team with flexible work culture.
- Ownership, continuous learning, and collaboration.
- Impact work on data used by Global 2000 companies.