DevOps / DataOps Engineer (Private AI / AI Workbench)
Job description
Overview: The DevOps/DataOps Engineer builds and operates the deployment and data foundations for AI Workbench. This role automates infrastructure provisioning, manages Kubernetes/container operations, builds CI/CD pipelines, and implements secure data pipelines and observability so AI-driven solutions run reliably in private or hybrid environments.
- Provision and manage environments using Infrastructure as Code (IaC) for compute, networking, storage, and security controls.
- Administer Kubernetes/container platforms to host AI Workbench services, agent runtimes, model endpoints, and supporting components.
- Build and maintain CI/CD pipelines for applications/services, agent configurations, and infrastructure updates with automated checks.
- Implement DataOps pipelines for RAG ingestion: secure connectors, preprocessing jobs, scheduling, data quality checks, and lineage tracking.
- Implement observability: logs, metrics, traces, dashboards, alerting, and SLO/SLI monitoring across platform and workloads.
- Harden environments: secrets management, vulnerability scanning, image signing, policy-as-code, and least-privilege access.
- Support release management, incident response, and operational handover including runbooks and knowledge transfer.
- Optimize performance and cost: resource sizing, autoscaling policies, GPU scheduling, and storage optimization.
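To make the DataOps responsibility above concrete, here is a minimal sketch of the kind of data quality gate a RAG ingestion pipeline might run before indexing documents. This is an illustrative example only; the `Document` record, the `quality_check` function, and the `content_sha256` lineage field are assumed names for this sketch, not part of any AI Workbench API.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Document:
    """An ingested record awaiting preprocessing and indexing (illustrative)."""
    source: str
    text: str
    metadata: dict = field(default_factory=dict)

def quality_check(docs, min_chars=20):
    """Filter ingested documents before indexing: drop empty/short text and
    exact duplicates, and stamp each surviving record with a content hash
    so downstream lineage tracking can tie chunks back to exact inputs."""
    seen = set()
    passed, rejected = [], []
    for doc in docs:
        text = doc.text.strip()
        if len(text) < min_chars:
            rejected.append((doc, "too_short"))
            continue
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if digest in seen:
            rejected.append((doc, "duplicate"))
            continue
        seen.add(digest)
        doc.metadata["content_sha256"] = digest  # lineage: exact-content fingerprint
        passed.append(doc)
    return passed, rejected
```

In a production pipeline a step like this would typically run as a scheduled job, with the `rejected` list emitted as data-quality metrics for the observability stack rather than silently dropped.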
Requirements
- 3+ years in DevOps/SRE and/or DataOps roles supporting enterprise platforms.
- Hands-on experience with Kubernetes and container tooling.
- Experience building CI/CD pipelines and IaC automation.
- Strong knowledge of security practices in platform operations.
- Experience with data pipelines and ETL/ELT concepts.
Preferred Qualifications
- Experience supporting AI/ML platforms, GPU workloads, or model inference services.
- Experience with policy-as-code (OPA/Gatekeeper-like concepts) and compliance-driven operations.
- Familiarity with vector databases and search indexing operations.
- Experience with hybrid connectivity and network security patterns.
Tech stack
- Kubernetes, containers, Helm/GitOps concepts
- IaC (Terraform/Ansible), CI/CD
- Data pipelines, scheduling, and data quality
- Security hardening, secrets, scanning
- Monitoring/observability (metrics, logs, tracing)
Work Conditions
- US-based role; onsite 5 days a week (Arlington, VA/DC Area)
- Works with Architecture, DevOps/DataOps, QA, and Product roles to deliver end-to-end solutions.
- Secret clearance and U.S. citizenship required.
- 5+ years of experience in database administration and management.

If you are an individual with a disability, a disabled veteran, or a wounded warrior and you are unable or limited in your ability to access or use this site as a result of your disability, you may request a reasonable accommodation by contacting us via email (gss-hr-er@dxc.com).
Benefits & conditions
Compensation at DXC is influenced by an array of factors, including but not limited to experience, job-related knowledge, skills, and competencies, as well as contract-specific affordability and organizational requirements. A reasonable estimate of the current compensation range for this position is $121,300 - $225,300.
Full-time hires are eligible to participate in the DXC benefit program. DXC offers a comprehensive, flexible, and competitive benefits program which includes, but is not limited to, health, dental, and vision insurance coverage; employee wellness; life and disability insurance; a retirement savings plan; paid holidays; and paid time off.
At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.