Related skills
sql, python, airflow, data pipelines, pyspark
Description
- Deploy and manage AI evaluation workflows across infrastructure
- Analyze data to identify trends and model improvements
- Build and maintain scalable data pipelines
- Collaborate with security, engineering, and product teams
- Automate workflows to boost operational efficiency
Requirements
- Proficient in Python, including API calls with requests and JSON handling
- PySpark and SQL (Hive) for data processing in distributed environments
- Solid software development fundamentals; version control with Git
- Strong problem-solving and critical thinking
- Eager to learn; good communication; ownership mindset
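To illustrate the kind of day-to-day work the Python requirement implies, here is a minimal sketch of JSON handling for evaluation results. The payload, field names, and threshold are hypothetical; in practice the data would come from an HTTP call such as `requests.get(url).json()` against a real endpoint.

```python
import json

# Hypothetical payload; a real pipeline would fetch this from an API
# with requests rather than hard-coding it.
payload = '{"runs": [{"model": "a", "score": 0.91}, {"model": "b", "score": 0.87}]}'

data = json.loads(payload)

# Keep only runs above an assumed quality threshold, best first.
passing = sorted(
    (r for r in data["runs"] if r["score"] >= 0.9),
    key=lambda r: r["score"],
    reverse=True,
)
print([r["model"] for r in passing])  # → ['a']
```

The same filter-and-rank step scales up naturally to PySpark DataFrames when the run data lives in Hive rather than a single JSON response.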
Benefits
- Competitive salary and benefits
- Remote work arrangement
- Early career program with networking and development