Related skills
bigquery, snowflake, python, databricks, kafka

Description
- Embed with strategic customers to understand data needs and goals.
- Build production data pipelines, connectors, and workflows across warehouses and APIs.
- Design and deploy AI-powered data flows with Nexla connectors and transformations.
- Own end-to-end delivery: scope, build, test, deploy, run.
- Navigate enterprise data environments with messy schemas, undocumented APIs, and regulatory constraints.
- Influence product by surfacing patterns and gaps to CTO/Product/Eng.
Requirements
- 1+ years of software engineering experience building production systems.
- Experience deploying agentic AI systems in enterprise environments.
- Strong Python skills for building data pipelines, integrations, and APIs.
- Knowledge of Snowflake, Databricks, BigQuery, Kafka, ETL/ELT, and the modern data stack.
- Hands-on AI/LLM tech: prompt engineering, agent development, orchestration.
- Confidence in customer-facing work: running discovery calls and explaining trade-offs.
Benefits
- Hybrid work with travel to customer sites.
- Medical, dental, and vision benefits.
- 401(k) retirement plan.
- Flexible PTO.
- Opportunity to influence product and own outcomes.
- Collaborative, data-driven culture.