Added: 3 days ago
Type: Full time
Salary: Not provided

Related skills

AWS, Snowflake, PostgreSQL, SQL, Python

πŸ“‹ Description

  • Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
  • Model, store, and transform data to support analytics, reporting, and downstream applications.
  • Build API-based and file-based integrations across enterprise platforms.
  • Ensure compliance with federal security and data governance frameworks (HIPAA, NIST, RMF, CMMC).
  • Implement monitoring, logging, and data quality checks for reliable, scalable pipelines.
  • Partner with analysts and data science teams to deliver trusted AI-ready datasets.

🎯 Requirements

  • 5+ years of experience in data engineering or data platform development.
  • Strong SQL skills and experience with PostgreSQL, Oracle, and SQL Server.
  • Hands-on experience with ETL/ELT tools (Talend preferred; Apache Spark, Apache Airflow, dbt, or Informatica).
  • Experience with modern data platforms (Snowflake, Redshift, BigQuery, or equivalent).
  • Proficiency in Python for building and automating pipelines.
  • Experience operating in cloud environments (AWS preferred, including GovCloud-style environments).
  • Understanding of data lake or lakehouse architectures.
  • Bachelor's degree.

🎁 Benefits

  • Fully remote
  • Tech & Education Stipend
  • Comprehensive Benefits Package
  • Company Match 401(k) plan
  • Flexible PTO, Paid Holidays