Related skills
sql, python, airflow, git, pyspark

Description
- Deploy and manage AI evaluation processes for performance monitoring.
- Analyze data to identify trends and model enhancement opportunities.
- Build and maintain scalable data pipelines feeding downstream systems.
- Collaborate with security, engineering, and product teams on compliance initiatives.
- Automate workflows to improve operational efficiency.
Requirements
- Proficient in Python; experience with API calls (requests) and JSON.
- Strong data-processing skills in PySpark and SQL (Hive) for large workloads.
- Solid software development fundamentals, including Git.
- Excellent problem-solving and logical thinking; strong communication.
- Eagerness to learn; ability to own tasks from conception to completion.
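The Python, API, and JSON skills listed above can be illustrated with a minimal sketch. The payload shape, field names, and the `extract_scores` helper are illustrative assumptions, not the employer's actual schema or code.

```python
import json

def extract_scores(payload: str) -> dict[str, float]:
    """Parse an API-style JSON payload (hypothetical schema) into a
    mapping of model name -> evaluation score."""
    data = json.loads(payload)
    return {item["model"]: item["score"] for item in data["results"]}

# Sample payload standing in for a real API response (e.g. from requests.get(...).text)
sample = '{"results": [{"model": "a", "score": 0.91}, {"model": "b", "score": 0.87}]}'
print(extract_scores(sample))  # {'a': 0.91, 'b': 0.87}
```

In a real pipeline the payload would come from an HTTP call (e.g. `requests.get(url).json()`) rather than a literal string.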
Benefits
- Shape the future with a leading blockchain ecosystem.
- Collaborate with world-class talent in a global org.
- Tackle fast-paced projects with autonomy and innovation.
- Career growth and continuous learning opportunities.
- Competitive salary and comprehensive benefits.
- Work-from-home arrangement may vary by team.