Added: 25 days ago
Location: Remote (Romania)
Type: Full time
Salary: Not specified

Data Architect (Remote - Romania)

Department: Product Architecture
Reports to: Chief Architect

This Data Architect role leads the design and delivery of our data architecture and pipelines, integrating clinical and financial data under GxP and SOX compliance. The position focuses on modernizing the data architecture, building efficient data pipelines, and applying advanced modeling to support analytics and reporting. Candidates should bring deep AWS data architecture expertise, strong data engineering skills, and experience guiding cross-functional teams in regulated environments.

Responsibilities:

  • Define and communicate the reference architecture for a secure, compliant cloud data platform on AWS, harmonizing clinical and financial data under GxP and SOX controls.
  • Design curated, versioned data zones using S3, AWS Glue Catalog, and Iceberg/Delta-style table formats to optimize performance and cost.
  • Modernize ingestion and ELT processes by establishing real-time (AWS DMS, PySpark Structured Streaming) and batch (Glue Jobs, PySpark) pipelines, replacing legacy ETL with declarative, testable workflows for clinical and payment data (a minimal sketch of this pattern appears after this list).
  • Lead the design and implementation of dimensional and data-vault models, unifying subject, site, supply chain, and payment domains; govern semantic layers in Athena/Redshift Spectrum and Tableau/QuickSight for analytics and AI enablement.
  • Develop and maintain secure, versioned GraphQL endpoints and data services to provide streamlined access to curated datasets for strategic reporting.
  • Champion data governance and data quality by defining data contracts, schema evolution strategies, lineage, and automated validation—ensuring auditability across the SDLC.
  • Mentor and influence teams of 20+ engineers; lead architecture reviews, collaborative design sessions, and internal workshops.
  • Serve as a technical advisor to product, security, and compliance stakeholders.
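
For illustration only: the streaming-ingestion bullet above describes PySpark Structured Streaming pipelines that land raw events into a curated S3 zone. The sketch below shows one minimal way such a pipeline could look. The bucket paths, schema fields, and trigger interval are assumptions made up for this example; a production job would register its output in the Glue Catalog and write to an Iceberg/Delta table rather than plain Parquet.

    # Minimal PySpark Structured Streaming sketch (illustrative only).
    # Paths and schema are hypothetical; a real pipeline would target an
    # Iceberg/Delta table registered in the Glue Catalog.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = SparkSession.builder.appName("clinical-payments-ingest").getOrCreate()

    # Hypothetical schema for payment change events landed by AWS DMS.
    event_schema = StructType([
        StructField("payment_id", StringType(), False),
        StructField("site_id", StringType(), True),
        StructField("amount", DoubleType(), True),
        StructField("currency", StringType(), True),
        StructField("event_ts", TimestampType(), True),
    ])

    # Read newly arriving JSON files from the raw zone as a stream.
    raw = (
        spark.readStream
        .schema(event_schema)
        .json("s3://example-raw-zone/payments/")  # assumed path
    )

    # Light validation plus a partition column for the curated zone.
    curated = (
        raw.where(F.col("payment_id").isNotNull())
           .withColumn("ingest_date", F.to_date("event_ts"))
    )

    # Write to the curated zone with checkpointing for fault-tolerant,
    # exactly-once file output.
    query = (
        curated.writeStream
        .format("parquet")
        .option("path", "s3://example-curated-zone/payments/")  # assumed path
        .option("checkpointLocation",
                "s3://example-curated-zone/_checkpoints/payments/")
        .partitionBy("ingest_date")
        .trigger(processingTime="1 minute")
        .start()
    )

    query.awaitTermination()

The checkpoint location is what gives the file sink its exactly-once output guarantees, which is the property that matters for auditability in a GxP/SOX setting.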

Requirements:

  • At least 10 years designing and delivering data-intensive systems, with at least 3 years in a principal-level or architecture role.
  • Direct experience with clinical-trial, life-sciences, or financial-ledger data.
  • Knowledge of ledger-based payments processing and reconciliation.
  • Deep expertise with the AWS analytics stack: S3, Glue/Glue Studio & Catalog, Athena, Redshift, Lake Formation, Lambda, EMR or EKS-hosted PySpark.
  • Experience building GraphQL or REST data services that front a lakehouse or warehouse.
  • Mastery of data modeling (dimensional, data vault, lakehouse patterns) and performance tuning for large-scale analytical workloads.
  • Hands-on with Python & PySpark, SQL, and infrastructure-as-code (Terraform/CDK).
  • Familiarity with regulatory / GxP / SOX environments and secure-by-design principles (encryption, tokenization, IAM, PII/PHI segregation).
  • Experience guiding cross-functional teams in an Agile/DevOps/SRE culture.
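
For illustration only: the GraphQL requirement above refers to data services that front a lakehouse or warehouse. The minimal sketch below uses the graphene Python library to expose a single hypothetical payments query; the field names are invented for this example, and the in-memory sample stands in for what would really be an Athena or Redshift Spectrum query against the curated zone.

    # Minimal GraphQL data-service sketch using graphene (illustrative only).
    # All field names are hypothetical; the in-memory sample stands in for an
    # Athena / Redshift Spectrum query against the curated zone.
    import graphene


    class Payment(graphene.ObjectType):
        payment_id = graphene.String()
        site_id = graphene.String()
        amount = graphene.Float()


    class Query(graphene.ObjectType):
        # `payments` takes a required siteId argument (graphene auto-camelcases
        # snake_case names by default).
        payments = graphene.List(Payment, site_id=graphene.String(required=True))

        def resolve_payments(self, info, site_id):
            sample = [{"payment_id": "p-001", "site_id": site_id, "amount": 125.0}]
            return [Payment(**row) for row in sample]


    schema = graphene.Schema(query=Query)

    # Example query a reporting client might issue.
    result = schema.execute('{ payments(siteId: "site-42") { paymentId amount } }')
    print(result.data)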

We are aware that individuals have been fraudulently representing themselves as Suvoda recruiters and/or hiring managers. Suvoda will never request personal information such as your bank account number, credit card number, driver's license, or social security number, nor request payment from you, during the job application or interview process. Any emails from the Suvoda recruiting team will come from a @suvoda.com email address. You can learn more about these types of fraud by referring to this FTC consumer alert.

As set forth in Suvoda’s Equal Employment Opportunity policy, we do not discriminate on the basis of any protected group status under any applicable law.

If you are based in California, we encourage you to read this important information for California residents linked here.
