Data Engineer

SZNS SOLUTIONS LLC
Reston, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Reston, United States of America

Tech stack

Java
Artificial Intelligence
Airflow
Google BigQuery
Computer Programming
Information Engineering
Data Infrastructure
ETL
Data Security
Data Stores
Data Warehousing
Disaster Recovery
Fault Tolerance
Data Flow Control
Graph Database
Python
PostgreSQL
Machine Learning
MongoDB
Neo4j
NoSQL
Cloudera
SQL Databases
Data Streaming
Unstructured Data
Workflow Management Systems
Google Cloud Platform
Snowflake
Spark
Generative AI
Data Lake
Information Technology
Apache Flink
Kafka
Data Pipelines
Automation Anywhere
Apache Beam
Databricks

Job description

  • ETL/ELT Pipelines: Architect and deploy pipelines to ingest, transform, and store data from high-volume, disparate sources for real-time analysis
  • Build for the Enterprise: Create a highly reliable single source of truth for enterprise intelligence and enablement
  • AI Workflow Enablement: Architect and optimize production-grade data foundations to support high-performance AI workflows and automated decision-making
  • Operations & Governance: Establish and automate strict data security, quality assurance, and governance processes; design systems for high fault tolerance and rapid disaster recovery
  • Efficiency: Design and model for efficient queries, resource usage, workload scheduling, and cost

Requirements

  • Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
  • 5+ years of experience in data engineering within a cloud environment, demonstrating a clear progression from engineering into architectural design
  • Proficiency in SQL and strong programming skills in Python, Rust, or Java
  • Experience building and maintaining data pipelines using processing/streaming frameworks (e.g., Kafka, Flink, Beam, Spark) and orchestration tools (e.g., Airflow)
  • Experience architecting data stores and schemas for AI workflows (e.g., RAG)
  • Active Google Cloud certifications, or willingness to obtain within one month of joining
  • Builder mentality and bias for action
  • US citizenship required

Preferred Qualifications

  • Deep expertise in the Google Cloud Platform (GCP) ecosystem, specifically building streaming and batch pipelines using Dataflow (Apache Beam), Pub/Sub, BigQuery, and Cloud Composer (Airflow)
  • Strong background in data modeling and architecture across relational (e.g., PostgreSQL), NoSQL (e.g., Firestore, MongoDB), and graph databases (e.g., Neo4j), including modern cloud data warehouses (e.g., BigQuery) and data lakes (e.g., GCS, Dataproc)
  • Demonstrated experience setting up infrastructure for modern data science, machine learning, or Generative AI (e.g., preparing unstructured data, vector databases, RAG pipelines)
  • Familiarity with regulatory compliance frameworks (FedRAMP, HIPAA, etc.) and security strategies
  • Experience with modern data platforms like Snowflake or Databricks

Benefits & conditions

  • Competitive salary and benefits package
  • Hybrid work environment (Monday/Wednesday/Friday in person in our Reston office)
  • A collaborative and innovative work environment
  • Continuous learning and development opportunities

About the company

SZNS Solutions (pronounced "seasons") is a technology consulting company and Google Cloud Partner based in Reston, VA. We specialize in delivering agentic AI and cloud computing solutions. Founded by ex-Googlers with engineers from Google, Amazon, and Capital One, SZNS differentiates itself particularly in AI, data engineering, blockchain, and cloud-native software application development.

Apply for this position