Related skills
sql, python, scala, dbt, spark
📋 Description
- Build and maintain data pipelines and data models for analytics and BI.
- Work across multiple cloud platforms to support 1:1 retargeting and CRM.
- Develop code that runs pipelines cost-effectively and is easy to maintain.
- Create monitoring procedures and tools for ETL quality and reliability (see the sketch after this list).
- Collaborate with CRM developers, data scientists, analysts, and product owners.
- Design cloud‑based data architectures and migrations.
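As a rough illustration of the monitoring and quality work described above, here is a minimal sketch of a batch quality check in PySpark. The dataset path, the order_id column, and the pass/fail thresholds are hypothetical placeholders, not details from this posting; a real check would be tuned to the pipeline and its SLAs.

```python
# Minimal sketch of a batch ETL quality check, assuming PySpark is available.
# The path and column names (order_id) are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-quality-check").getOrCreate()

# Load the batch that just landed (path is a placeholder).
df = spark.read.parquet("s3://example-bucket/orders/dt=2024-01-01/")

# Basic profiling: row count, null rate on the key column, duplicate keys.
total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Fail the pipeline step if any simple threshold is breached.
if total == 0 or null_ids > 0 or dupes > 0:
    raise ValueError(
        f"Quality check failed: rows={total}, null_ids={null_ids}, duplicates={dupes}"
    )
```

In practice a check like this would run as its own pipeline step so that downstream loads and audience activations only see validated data.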
🎯 Requirements
- In‑depth knowledge of Adobe Experience Platform (AEP), schemas, datasets, and audience creation.
- Very strong SQL skills for large‑scale data querying and transformation.
- Experience with Spark, Python, and Scala.
- Experience with cloud data warehouses and pipelines (GCP, AWS, or Azure), including Dataflow, DBT, EMR, and Airflow (see the orchestration sketch after this list).
- Knowledge of data governance, data quality practices (profiling, cleansing, validation).
- Familiarity with Adobe Customer Journey Analytics (CJA) and Adobe Journey Optimizer (AJO).
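To give a feel for the orchestration side of the role, below is a minimal Airflow sketch, assuming Airflow 2.x. The DAG id, task names, daily schedule, and empty task bodies are illustrative assumptions and not part of the actual stack described in the posting.

```python
# Minimal sketch of an Airflow DAG wiring an extract -> transform -> validate
# pipeline. Assumes Airflow 2.x; all names and the schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from the source system.
    pass


def transform():
    # Placeholder: run the Spark/DBT transformation step.
    pass


def validate():
    # Placeholder: run profiling and quality checks before publishing.
    pass


with DAG(
    dag_id="crm_audience_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    # Run the steps in order: extract, then transform, then validate.
    t_extract >> t_transform >> t_validate
```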