Cloud Developer Senior

General Dynamics IT
Washington, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Remote
Washington, United States of America

Tech stack

Agile Methodologies
Airflow
Amazon Web Services (AWS)
Cloud Computing
Configuration Management
Continuous Integration
Information Engineering
Data Files
ETL
Data Systems
Linux
DevOps
Python
Machine Learning
Ansible
Software Engineering
SQL Databases
Gitlab
Kubernetes
Information Technology
Machine Learning Operations
Terraform
Software Version Control
Data Pipelines
Databricks

Job description

GDIT is your place. You make it your own by bringing your ideas and unique perspective to our culture. By owning your opportunity at GDIT, you help ensure today is safe and tomorrow is smarter. Our work depends on a Cloud Developer Senior joining our team to support our customer activities in Washington, DC. This position can be performed remotely.

  • Provides enterprise-level technical support to leadership to align IT systems and data solutions with organizational goals
  • Develops and maintains scalable, secure, and integrated system architectures
  • Designs, develops, and implements methods, processes, and systems to consolidate and analyze diverse data sets, both structured and unstructured
  • Builds the infrastructure pipelines required for optimal extraction, transformation, and loading of data from a wide variety of data sources

Requirements

  • Strong experience in Kubernetes orchestration for scalable deployment environments
  • Expertise in software development, preferably using Python
  • Knowledge of machine learning model deployment practices
  • Familiarity with ML orchestration tools (e.g., Kubeflow, MLflow, Airflow, SageMaker, or similar)
  • Experience with infrastructure-as-code using Terraform and OpenTofu
  • Proficiency with GitLab for source control, CI/CD, and DevOps workflows
  • Expertise in SQL, ETL automation, data pipelines, and cloud platforms (AWS)
  • Familiarity with Linux (RHEL9)

Preferred Additional Qualifications:

  • Hands-on experience with Ansible for configuration management and automated provisioning
  • Experience with Databricks for large-scale data engineering, ML workflows, and collaborative analytics
  • Experience with Agile methodology

Preferred Certification:

  • Certified Kubernetes Administrator (CKA)
  • Any machine learning certification

Education:

Requires a BA/BS degree in a related discipline.

Qualifications:

  • 5+ years of DevOps experience

Additional Requirements:

  • This position requires an existing Public Trust or the ability to obtain one.
