Added: 7 days ago
Type: Full time
Salary: Not provided

Related skills

SQL, Python, Airflow, Git, PySpark

πŸ“‹ Description

  • Deploy and manage AI evaluation processes for performance monitoring.
  • Analyze data to identify trends and model enhancement opportunities.
  • Build and maintain scalable data pipelines feeding downstream systems.
  • Partner with security, engineering, and product teams on security and compliance initiatives.
  • Automate workflows to improve operational efficiency.
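
The pipeline-building duty above can be sketched, in miniature, as a transform step that cleans incoming records before handing them downstream; the record shape and field names here are hypothetical, not the team's actual schema.

```python
def transform(records):
    """Drop malformed rows and normalize field names (hypothetical schema)."""
    out = []
    for rec in records:
        if "user_id" not in rec or rec.get("event") is None:
            continue  # drop malformed rows rather than failing the whole batch
        out.append({"user": rec["user_id"], "event": rec["event"].lower()})
    return out

raw = [
    {"user_id": 1, "event": "CLICK"},
    {"event": "VIEW"},              # missing user_id -> dropped
    {"user_id": 2, "event": None},  # null event -> dropped
]
print(transform(raw))  # [{'user': 1, 'event': 'click'}]
```

Skipping bad rows (instead of raising) keeps a scheduled batch job from failing on a single dirty record; real pipelines would also log or quarantine the drops.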

🎯 Requirements

  • Proficient in Python; experience with API calls (requests) and JSON.
  • Strong data processing in PySpark and SQL (Hive) for large workloads.
  • Solid software development fundamentals, including Git.
  • Excellent problem-solving and logical thinking; strong communication.
  • Eagerness to learn; ability to own tasks from conception to completion.
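
The Python/API/JSON requirement above can be illustrated with a minimal sketch: parse a JSON payload of evaluation metrics and aggregate per model. The endpoint URL and payload shape are invented for illustration.

```python
import json

def parse_metrics(payload):
    """Parse a JSON payload of evaluation scores (hypothetical shape) and
    return the average score per model."""
    totals = {}
    for rec in json.loads(payload):
        totals.setdefault(rec["model"], []).append(rec["score"])
    return {model: sum(s) / len(s) for model, s in totals.items()}

# In practice the payload would come from an API call, e.g.:
#   resp = requests.get("https://api.example.com/evals", timeout=10)
#   resp.raise_for_status()
#   averages = parse_metrics(resp.text)
sample = '[{"model": "a", "score": 0.5}, {"model": "a", "score": 1.0}, {"model": "b", "score": 0.25}]'
print(parse_metrics(sample))  # {'a': 0.75, 'b': 0.25}
```

Separating parsing/aggregation from the network call keeps the logic unit-testable without hitting a live API.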

🎁 Benefits

  • Shape the future with a leading blockchain ecosystem.
  • Collaborate with world-class talent in a global org.
  • Tackle fast-paced projects with autonomy and innovation.
  • Career growth and continuous learning opportunities.
  • Competitive salary and comprehensive benefits.
  • Work-from-home arrangement may vary by team.