Senior Data Engineer

Kforce Inc.
Washington, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Washington, United States of America

Tech stack

API
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Cloud Computing
Databases
Information Engineering
Data Governance
ETL
Data Systems
Data Warehousing
Relational Databases
File Systems
Python
Meta-Data Management
Operational Databases
Software Engineering
Data Streaming
Data Processing
Large Language Models
Containerization
Data Lake
Kubernetes
Information Technology
Real Time Data
Kafka
Data Management
Machine Learning Operations
Data Pipelines
Docker

Job description

We are seeking a Senior Data Engineer to support mission-critical data platforms within a secure Department of Homeland Security (DHS) environment. This role is responsible for designing, developing, and maintaining scalable, cloud-based data pipelines that enable advanced analytics, real-time data processing, and AI/ML-driven capabilities.

The ideal candidate brings deep experience in modern data engineering practices, cloud technologies, and streaming architectures, along with a strong understanding of operating in highly regulated and classified environments.

Responsibilities

Design, develop, and maintain scalable ETL/ELT data pipelines supporting structured and unstructured data sources.

Ingest and integrate data from APIs, relational databases, file systems, third-party services, and real-time streaming platforms.

Architect and deploy cloud-native data solutions leveraging AWS services.

Build and support data warehousing and data lake architectures optimized for analytics and performance.

Implement and manage streaming data pipelines using Apache Kafka or equivalent technologies.

Optimize and maintain data storage solutions across multiple paradigms, including relational, columnar, object, and key-value stores.

Orchestrate and monitor complex workflows using Apache Airflow or similar scheduling tools.

Collaborate with data scientists and analytics teams to integrate AI/ML and LLM-enabled capabilities into production data pipelines.

Ensure adherence to data governance, security, lineage, and compliance requirements within a classified environment.

Troubleshoot, optimize, and improve performance, reliability, and scalability of data systems.

Requirements

Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.

10+ years of experience in data engineering, software engineering, or related roles.

Strong proficiency in Python for data processing and pipeline development.

Hands-on experience with AWS cloud services supporting data engineering workloads.

Demonstrated experience with Apache Kafka and real-time data streaming architectures.

Experience building and maintaining data warehouses and data lakes.

Solid understanding of database technologies and storage trade-offs.

Familiarity with workflow orchestration tools such as Apache Airflow.

Experience working in secure, regulated, or classified environments.

Ability to work on-site at DHS Headquarters in Washington, DC.

Preferred Qualifications

Experience with data cataloging, metadata management, or semantic layer design.

Exposure to containerization and orchestration technologies (Docker, Kubernetes).

Familiarity with MLOps platforms or AI/ML deployment pipelines.

Prior experience supporting DHS, federal law enforcement, or Intelligence Community programs.

Clearance Requirement

Active TS/SCI clearance, or the ability to successfully obtain DHS EOD SCI.
