Senior Data Engineer

Leidos, Inc.
31 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$167K

Job location

Tech stack

API
Agile Methodologies
Artificial Intelligence
Amazon Web Services (AWS)
Azure
Information Systems
Continuous Integration
Information Engineering
Data Governance
Data Integration
ETL
Data Systems
Machine Learning
Metadata Management
NIPRNet
Data Streaming
Google Cloud Platform
Feature Engineering
Kubernetes
Information Technology
Data Analytics
Non-relational Database
Data Management
Data Pipelines
DevSecOps
Docker

Requirements

  • Active Top Secret (TS) clearance with SCI eligibility.
  • Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or related technical discipline and 8-12 years of relevant experience OR Master's degree in a related field and 6-10 years of relevant experience.
  • Minimum of 8 years of experience in data engineering or related roles.
  • Proven experience in designing and implementing data pipelines and architectures.
  • Experience with data quality monitoring tools and processes.
  • Excellent communication and interpersonal skills.
  • Experience developing and maintaining enterprise-scale data pipelines in cloud environments (AWS, Azure, or GCP).
  • Experience implementing ETL/ELT processes and data orchestration frameworks.
  • Strong knowledge of ETL processes and data integration techniques.
  • Experience working with structured and unstructured data sources, including APIs, streaming platforms, and relational/non-relational databases.
  • Experience integrating data pipelines into DevSecOps CI/CD environments.
  • Experience with cloud-based data solutions and architectures.
  • Familiarity with data governance frameworks and best practices.
  • Experience operating within SAFe or large-scale Agile frameworks supporting enterprise systems.
  • Experience supporting data platforms across NIPRNet, SIPRNet, and JWICS environments.
  • Experience implementing data governance controls, lineage tracking, and metadata management frameworks.
  • Experience supporting AI/ML data preparation and feature engineering workflows.
  • Knowledge of machine learning and AI principles.
  • Experience working with containerized environments (e.g., Docker, Kubernetes).
  • Relevant cloud or data engineering certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer, Google Cloud Professional Data Engineer, or equivalent).

Benefits & conditions

In this role, you will work alongside government partners, engineers, and other industry teammates to translate operational and strategic requirements into scalable, production-ready solutions. You will contribute directly to product planning, execution, and continuous improvement, helping ensure capabilities are delivered efficiently, aligned to mission priorities, and positioned for sustained success.

This position offers the opportunity to work on a high-visibility, enterprise program at the intersection of data, analytics, and emerging AI technologies. Ideal candidates are motivated by mission impact, comfortable operating in complex stakeholder environments, and interested in building deep domain expertise while delivering capabilities with real-world national security outcomes.

Primary Responsibilities:

  • Design, build, and maintain data pipelines and architectures to support data ingestion, transformation, integration, storage, and dissemination.
  • Apply software engineering and ETL principles to ensure data accuracy, quality, consistency, and scalability.
  • Integrate COTS and customer-developed tools within existing data frameworks to meet operational and analytical requirements.
  • Collaborate with DataOps teams to prepare, automate, and optimize data workflows for real-time analytics.
  • Implement and enforce data security policies, including data encryption and access controls.
  • Monitor data quality and implement proactive alerting on data pipelines.
  • Develop and maintain documentation for data pipelines, models, and governance policies.
  • Provide Tier-2 and Tier-3 support for enterprise data products and services.
  • Conduct root cause analysis for recurring issues and implement solutions to prevent future occurrences.
  • Develop and maintain training materials and a centralized knowledge repository for data operations.
  • Proactively communicate with customers regarding known issues and new features.
  • Collect customer feedback to identify areas for improvement and enhance customer satisfaction.
  • Foster a collaborative team environment that encourages innovation and continuous improvement.
  • Ensure compliance with all applicable regulations related to data privacy and security.
