Related skills
bigquery aws snowflake sql python

Description
- Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
- Increase robustness of production pipelines with monitoring, testing, and docs templates.
- Collaborate with the Data/Analytics and Business Systems teams to scale product usage data pipelines and ensure real-time ingestion.
- Build custom applications and integrations to automate manual tasks for Product Operations/Support/SRE.
- Contribute to the health of data systems by designing and documenting good data usage practices.
Requirements
- Bachelor's degree in Computer Science or related field.
- 3+ years of experience in a similar role.
- Python and SQL scripting or development experience.
- Experience with Dataiku, Snowflake, Databricks, or BigQuery.
- Knowledge of AWS, Azure, and GCP cloud ecosystems.
- Experience orchestrating pipelines and data governance practices.
Benefits
- Equal opportunity employer.