Related skills
sql python dbt airflow spark

📋 Description
- Build data pipelines and models for analytics and BI.
- Leverage cloud platforms for 1:1 retargeting and CRM.
- Identify, collect, and integrate data from multiple sources.
- Develop and optimize code for cost-effective pipelines.
- Create monitoring tools to ensure ETL data quality (see the sketch after this list).
- Collaborate with CRM teams, data scientists, and analysts on audiences and reporting.
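As a rough illustration of the ETL data-quality monitoring described above, the sketch below runs two simple checks (row count and null rate) against a loaded table. The table name, column, and 5% threshold are hypothetical and not taken from the posting; a real pipeline would point these checks at the warehouse rather than an in-memory SQLite database.

```python
# Minimal data-quality check sketch. "orders", "email", and the 5% null
# threshold are illustrative assumptions, not requirements from the posting.
import sqlite3
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def run_quality_checks(conn: sqlite3.Connection, table: str = "orders") -> list[CheckResult]:
    results = []

    # Check 1: the load produced at least one row.
    row_count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    results.append(CheckResult(
        name="non_empty_load",
        passed=row_count > 0,
        detail=f"{row_count} rows loaded",
    ))

    # Check 2: null rate on a key column stays under the (arbitrary) 5% threshold.
    null_count = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE email IS NULL"
    ).fetchone()[0]
    null_rate = null_count / row_count if row_count else 1.0
    results.append(CheckResult(
        name="email_null_rate",
        passed=null_rate <= 0.05,
        detail=f"{null_rate:.1%} null emails",
    ))
    return results


if __name__ == "__main__":
    # Tiny in-memory fixture so the sketch runs end to end.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, email TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "a@x.com"), (2, None)])
    for check in run_quality_checks(conn):
        status = "PASS" if check.passed else "FAIL"
        print(f"[{status}] {check.name}: {check.detail}")
```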
🎯 Requirements
- 5+ years of data engineering experience in Martech.
- Proficient in SQL, Python, and Spark; Scala is a plus.
- Experience building data pipelines and models for analytics.
- Experience with Adobe Experience Platform (AEP), Customer Journey Analytics (CJA), and Adobe Journey Optimizer (AJO).
- Familiar with cloud data warehousing, dbt, Airflow, EMR, and Dataflow (see the DAG sketch after this list).
- Knowledge of data governance, data quality, and profiling; experience with GitHub.
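For context on how the Airflow and dbt pieces of this stack typically fit together, here is a minimal, hypothetical DAG (Airflow 2.4+ style) that runs and then tests a dbt project. The DAG id, schedule, and project path are assumptions for illustration only, not details from the posting.

```python
# Hedged sketch of dbt orchestration in Airflow; dag_id, schedule, and
# the /opt/dbt project path are placeholder assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_models",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the models, then run dbt's tests against them.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test
```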