Data Engineer

EVER FORTH LLC
Fairfax, United States of America
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$145K

Job location

Remote
Fairfax, United States of America

Tech stack

Query Performance
JavaScript
Data analysis
Automation of Tests
Batch Processing
Unix
Program Optimization
Continuous Integration
Data Governance
Data Integrity
ETL
Data Mapping
Data Mart
Data Mining
Data Warehousing
Database Queries
Database Schema
Java Platform Enterprise Edition (J2EE)
Iterative and Incremental Development
Spring
Python
OAuth
Oracle Applications
Software Tools
Shell Script
SQL Databases
Teradata
Web Services
Data Processing
Scripting (Bash/Python/Go/Ruby)
Google Cloud Platform
Data Ingestion
Spark
Caching
Indexer
Git
Containerization
Kafka
REST
Data Pipelines
Database Tools and Utilities
Databricks
Microservices

Job description

  • Implement and optimize data pipeline architectures for data sourcing, ingestion, transformation, and extraction processes, ensuring data integrity, consistency, and compliance with organizational standards.
  • Develop and maintain scalable database schemas, data models, and data warehouse structures; perform data mapping, schema evolution, and integration between source systems, staging areas, and data marts.
  • Automate data extraction workflows and develop comprehensive technical documentation for ETL/ELT procedures; collaborate with cross-functional teams to translate business requirements into technical specifications and data schemas.
  • Establish and enforce data governance standards, including data quality metrics, validation rules, and best practices for data warehouse design, architecture, and tooling.
  • Develop, test, and deploy ETL/ELT scripts and programs using SQL, Python, Spark, or other relevant languages; optimize code for performance, scalability, and resource utilization.
  • Implement and tune data warehouse systems, focusing on query performance, batch processing efficiency, and resource management; utilize indexing, partitioning, and caching strategies.
  • Perform advanced data analysis, validation, and profiling using SQL and scripting languages; develop data models, dashboards, and reports in collaboration with stakeholders.
  • Conduct testing and validation of ETL workflows to ensure data loads meet scheduled SLAs and business quality standards; document testing protocols, results, and remediation steps.
  • Perform root cause analysis for data processing failures, troubleshoot production issues, and implement corrective actions; validate data accuracy and consistency across systems; support iterative development and continuous improvement of data pipelines.

Salary Range: $130,000-$145,000

Requirements

  • 5-10+ years of experience building and designing data extraction, formatting, and engineering tools, workflows, pipelines, and ETL/ELT processes.
  • Detail-oriented with strong analytical and problem-solving skills.
  • Ability to use database tools, techniques, and applications (e.g., Teradata, Oracle, Non-Relational) to develop complex SQL statements (e.g., multi-join), and to tune and troubleshoot queries for optimal performance.
  • Skill using Unix/Linux shell scripting to develop and implement automation scripts for Extract, Transform, Load (ETL) processes.
  • Communication skills (both verbal and written), with the ability to work and communicate with all levels of the team structure.
  • Team player with the ability to prioritize and multi-task, work in a fast-paced environment, and effectively manage time.
  • Experience with Java/J2EE, REST APIs, and Web Services; building event-driven microservices and Kafka streaming using Schema Registry and OAuth authentication.
  • Experience with the Spring Framework and Google Cloud Platform services in public cloud infrastructure; Git, CI/CD pipelines, and containerization; data ingestion and data modeling.
  • Ability to develop microservices using Java/J2EE and Spring to ingest large volumes of real-time events into Kafka topics, and to architect solutions that make that data available to consumers in real time.

Desired Skills

  • Fluent with Databricks concepts and terminology, such as workspaces and catalogs
  • Highly competent in SQL, Python, Spark, JavaScript, and other data engineering tools
  • Experienced in Unix and Linux scripting

About the company

Everforth ECS is the federal segment of Everforth, a $4B global organization with over 10,000 employees. Our nearly 3,500 professionals deliver advanced technology solutions in data and AI, cybersecurity, and enterprise transformation, serving defense, intelligence, and federal civilian agencies. Our work powers mission-critical outcomes, strengthens technology partnerships, and creates meaningful opportunities for our people. We are defined by a commitment to excellence in delivery, a culture of innovation, and an environment where talent can thrive and grow.

Apply for this position