Data Engineer

General Dynamics IT
Chantilly, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, German
Experience level
Senior

Job location

Chantilly, United States of America

Tech stack

Java
API
Amazon Web Services (AWS)
Apache HTTP Server
Azure
Business Systems
Cloud Database
Continuous Integration
Data as a Service
Data Governance
Data Integration
Data Transformation
Electronic Data Interchange (EDI)
Github
Python
Maven
Data Streaming
Systems Architecture
Systems Integration
Data Processing
Google Cloud Platform
Cloud Platform System
System Availability
Containerization
GitLab CI
low-code
Real Time Data
GraphQL
Data Pipelines
API Management
Jenkins

Job description

As a Senior Data Engineer, the work you'll do at GDIT will be impactful to the mission of U.S. Army Europe. You will play a crucial role in providing Mission Command Support for U.S. Forces in the European and African areas of responsibility (AOR).

  • Design, develop, and maintain end-to-end data integration pipelines across MSS, Army enterprise platforms, and cloud environments (AWS, Azure, GCP)
  • Build and manage API integrations (REST/GraphQL) to enable secure, scalable data exchange between mission systems, business systems, and external partners
  • Develop data transformation and processing logic using Python and/or Java, supporting both batch and real-time data workflows
  • Leverage low-code/no-code data integration tools such as Apache NiFi to design, orchestrate, and monitor data flows, including ingestion, routing, transformation, and enrichment
  • Perform data manipulation and transformation (cleansing, normalization, aggregation, schema mapping) across structured and unstructured data sources
  • Implement and orchestrate workflows within Maven Smart Systems (MSS), embedding data governance, validation, and automation into operational pipelines
  • Integrate MSS with cloud-based data lakes and analytics platforms, ensuring reliable ingestion, transformation, and access to data
  • Design and implement CI/CD pipelines for data workflows and integration services using tools such as GitLab CI, Jenkins, or GitHub Actions
  • Utilize containerization and orchestration tools like Docker and Kubernetes to deploy scalable data services
  • Apply infrastructure as code using Terraform or AWS CloudFormation to provision and manage platform resources
  • Monitor and optimize data pipelines, APIs, and system performance, ensuring high availability and reliability
  • Collaborate with data engineers, analysts, and system owners to define data contracts, schemas, and integration patterns
  • Maintain documentation for APIs, pipelines, workflows, and system architecture

Requirements

  • Education: Bachelor of Science
  • Experience: 5+ years of related experience
  • Technical skills: Experience with AWS, Azure, and Google Cloud Platform. Experience implementing CI/CD pipelines for data workflows and integration services using tools such as GitLab CI, Jenkins, or GitHub Actions
  • Security clearance level: Secret Level Security Clearance
  • US citizenship required
  • Role requirements: DoD 8570 IAT Level II Certification
  • Must be able to obtain German TESA

Benefits & conditions

At GDIT, the mission is our purpose, and our people are at the center of everything we do.

  • Growth: AI-powered career tool that identifies career steps and learning opportunities
  • Support: An internal mobility team focused on helping you achieve your career goals
  • Rewards: Comprehensive benefits and wellness packages, 401K with company match, and competitive pay and paid time off
  • Community: Award-winning culture of innovation and a military-friendly workplace
