Related skills: data engineering, ETL, SQL, Python, GraphQL

📋 Description
- Work across all aspects of data, from engineering to building sophisticated visualizations, machine learning models, and experiments
- Analyze and interpret large (PB-scale) volumes of transactional, operational and customer data using proprietary and open source data tools, platforms and analytical tool kits
- Translate complex findings into simple visualizations and recommendations for execution by operational teams and executives
- Process confidential data and information according to guidelines
- Manage and design the reporting environment, including data sources, security, and metadata
- Troubleshoot the reporting database environment and reports
🎯 Requirements
- Bachelor’s degree from an accredited university or college in Computer Science, Math, or Statistics
- Proficient in data engineering, modeling, and ETL; experience with data sourcing and working with APIs preferred
- Experience with data querying using SQL, GraphQL, Python
- Able to commit a minimum of 3 days per week for at least 6 months
- Understands project tokenomics and has good knowledge of the DeFi and Web 3.0 infrastructure landscape
- Experience using tools such as Dune Analytics, Nansen, etc.
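
To give a flavor of the data-querying work described above, here is a minimal sketch of handling a GraphQL-style response in Python, of the kind returned by DeFi subgraph APIs. The query shape, field names, and sample data are illustrative assumptions, not details from this posting.

```python
import json

# Hypothetical GraphQL query for a DeFi pairs endpoint (schema is an
# assumption for illustration, not from the job posting).
QUERY = """
{
  pairs(first: 3, orderBy: volumeUSD, orderDirection: desc) {
    id
    volumeUSD
  }
}
"""

def build_payload(query: str) -> bytes:
    """Encode a GraphQL query as the JSON body of an HTTP POST."""
    return json.dumps({"query": query}).encode("utf-8")

def top_pairs(response: dict) -> list:
    """Extract pair ids from a GraphQL response, sorted by USD volume."""
    pairs = response["data"]["pairs"]
    return [p["id"] for p in
            sorted(pairs, key=lambda p: float(p["volumeUSD"]), reverse=True)]

# Sample response shaped like a typical GraphQL result (fabricated values).
sample = {"data": {"pairs": [
    {"id": "BTC-USDT", "volumeUSD": "900.0"},
    {"id": "ETH-USDC", "volumeUSD": "1200.5"},
]}}
print(top_pairs(sample))  # ['ETH-USDC', 'BTC-USDT']
```

In practice the payload from `build_payload` would be POSTed to the provider's endpoint; the parsing step is the same regardless of transport.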
🎁 Benefits
- Shape the future with the world's leading blockchain ecosystem
- Collaborate with world-class talent in a global, flat organization
- Tackle fast-paced projects with autonomy in an innovative environment
- Thrive in a results-driven workplace with growth and learning
- Competitive salary and company benefits
- Work-from-home arrangement; may vary by team