Software Engineer Interoperability

CGT
Pittsburgh, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Pittsburgh, United States of America

Tech stack

JavaScript
API
Apache HTTP Server
Big Data
Google BigQuery
Code Review
Information Engineering
Data Infrastructure
Data Integration
ETL
Data Transformation
Data Security
Data Systems
DevOps
Distributed Computing Environment
Distributed Data Store
Hadoop
Hive
Interoperability
Python
OAuth
Performance Tuning
Software Engineering
Cloud Platform System
Fast Healthcare Interoperability Resources
Spark
Software Security
Hybrid Cloud
GitLab
PySpark
Information Technology
API Design
REST
Data Pipelines
Web API

Job description

We are seeking a highly skilled Software Engineer Interoperability to design, build, and support enterprise-scale healthcare data integration solutions. This role is responsible for developing FHIR-based APIs, building high-volume ETL/ELT pipelines, and supporting interoperability platforms. The position plays a critical role in enabling compliant, scalable, and secure data exchange across healthcare systems while supporting regulatory initiatives and enterprise data strategies.

  • Design, develop, and maintain FHIR-based interoperability solutions and RESTful APIs
  • Configure and support interoperability platforms, including SmileCDR repositories and FHIR endpoints
  • Build and optimize large-scale ETL/ELT pipelines for clinical, claims, and member data
  • Develop data transformation logic, mapping, and validation processes
  • Integrate third-party healthcare platforms and external APIs
  • Support API security, performance tuning, and monitoring
  • Develop and maintain pipelines using Python, PySpark, and distributed processing frameworks
  • Implement and optimize workflows within Informatica Big Data Management (BDM)
  • Work with modern data platforms including DBT, Starburst/Trino, Apache Iceberg, and cloud environments
  • Build and maintain CI/CD pipelines and support DevOps best practices
  • Collaborate with cross-functional teams including product, architecture, compliance, and engineering
  • Provide production support, troubleshooting, and root cause analysis
  • Participate in code reviews, design discussions, and technical documentation
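By way of illustration only (not part of the posting), the data transformation, mapping, and validation work described above can be sketched in Python. The source-record field names and the identifier system URI here are assumptions, not details from the role description:

```python
# Hypothetical sketch: mapping a raw member record to a FHIR R4 Patient
# resource with minimal validation. The input field names and the
# "urn:example:member-id" system are illustrative assumptions.

def to_fhir_patient(record: dict) -> dict:
    """Map a raw member record to a FHIR R4 Patient resource dict."""
    required = ("member_id", "last_name", "first_name", "birth_date")
    missing = [f for f in required if not record.get(f)]
    if missing:
        raise ValueError(f"missing required fields: {missing}")

    return {
        "resourceType": "Patient",
        "identifier": [{"system": "urn:example:member-id",
                        "value": record["member_id"]}],
        "name": [{"family": record["last_name"],
                  "given": [record["first_name"]]}],
        "birthDate": record["birth_date"],  # FHIR expects YYYY-MM-DD
    }
```

In practice this mapping would be driven by a validated FHIR profile (e.g., US Core) rather than ad-hoc checks.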

Requirements

  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
  • 5+ years of experience in software development or data engineering
  • Hands-on experience with healthcare interoperability standards and API development
  • Experience building and maintaining large-scale data pipelines
  • Proven experience working with cloud platforms and distributed data systems
  • Experience working with healthcare interoperability standards (HL7 FHIR, US Core, Da Vinci)
  • Ability to support regulatory and compliance-driven initiatives (e.g., CMS mandates)
  • Experience with secure API frameworks (OAuth2, REST standards)
  • Availability to support production systems and participate in on-call or troubleshooting efforts as needed
  • Strong expertise in FHIR, SmileCDR, and REST API development
  • Proficiency in Python, PySpark, and JavaScript
  • Experience with ETL/ELT development, data modeling, and transformation
  • Hands-on experience with Informatica BDM and big data ecosystems (Hadoop, Hive, Spark)
  • Familiarity with modern data platforms (DBT, Starburst/Trino, Apache Iceberg, BigQuery)
  • Experience with CI/CD pipelines and DevOps practices (GitLab or similar tools)
  • Strong understanding of data integration, API security, and performance optimization
  • Ability to analyze complex data systems and troubleshoot issues efficiently
  • Strong written and verbal communication skills for technical collaboration
  • Ability to manage multiple priorities in a fast-paced, high-volume environment
  • Experience in healthcare payer or provider environments
  • Familiarity with CMS, ONC, or similar regulatory frameworks
  • HL7 FHIR certification or equivalent hands-on implementation experience
  • Experience supporting enterprise-scale production systems
  • Exposure to hybrid cloud/on-prem data architectures
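As a purely illustrative sketch of the ETL/ELT pipeline development this role calls for, the pattern reduces to an extract-transform-load skeleton. Everything here (function names, the claims-row shape, the in-memory sink) is hypothetical; a production pipeline would use PySpark or Informatica BDM as the posting indicates:

```python
# Hypothetical extract/transform/load skeleton in plain Python.
# The claims-row fields and in-memory sink are illustrative assumptions.

def extract(rows):
    """Yield raw source rows (stand-in for a database, file, or API read)."""
    yield from rows

def transform(row):
    """Normalize one claims row: trim identifiers, cast amount to cents."""
    return {
        "claim_id": row["claim_id"].strip(),
        "amount_cents": int(round(float(row["amount"]) * 100)),
    }

def load(rows, sink):
    """Append transformed rows to a sink (stand-in for a warehouse write)."""
    for row in rows:
        sink.append(row)

def run_pipeline(source, sink):
    """Wire the stages together with a lazy generator, avoiding
    materializing the full dataset in memory."""
    load((transform(r) for r in extract(source)), sink)
```

The generator-based wiring mirrors how distributed engines stream records between stages instead of buffering whole datasets.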

Apply for this position