Resident Solutions Architect

Zachary Piper
McLean, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

McLean, United States of America

Tech stack

Amazon Web Services (AWS)
Data analysis
Azure
Big Data
Cloud Computing
Information Engineering
Data Systems
Distributed Data Store
Python
Spark
Data Management
Machine Learning Operations

Job description

We are seeking a Resident Solutions Architect (RSA) to join a professional services team supporting high-impact data and analytics initiatives within a federal environment. In this hands-on, customer-facing role, you will design, build, and implement scalable big data and cloud-based solutions while guiding stakeholders through successful adoption of modern data platforms. This position is ideal for a senior-level technologist who enjoys solving complex problems, working directly with customers, and delivering production-ready solutions.

Responsibilities

Deliver short- to medium-term professional services engagements focused on data engineering, analytics, and cloud technologies
Design and implement reference architectures and production-grade data solutions
Partner with engagement managers and stakeholders to define scope, timelines, and technical approach
Lead end-to-end implementation efforts, including design, build, testing, and deployment
Advise customers on architecture, best practices, and platform adoption strategies
Support data platform migrations and modernization initiatives
Provide escalated technical support during active engagements
Collaborate with cross-functional technical, project management, and customer teams
Contribute feedback to internal engineering and support teams to improve solutions and delivery
Create clear documentation and deliver whiteboard-style technical explanations for diverse audiences

Requirements

6+ years of experience in data engineering, analytics, or data platform development
Strong hands-on experience with Apache Spark and distributed data processing
Proficiency in Python or Scala
Experience working across at least two major cloud environments (AWS, Azure, or GCP) with deep expertise in one
Proven ability to design and deploy end-to-end, scalable data architectures
Familiarity with CI/CD pipelines and production deployment workflows
Working knowledge of MLOps concepts and practices
Experience delivering technical projects with defined scope and timelines
Strong client-facing communication, documentation, and whiteboarding skills

Preferred qualifications

Experience supporting federal or regulated environments
Background in data platform migrations or large-scale modernization efforts
Prior consulting or professional services experience

Benefits & conditions

This role may require up to 30% travel, depending on project needs.
Work is performed in collaboration with technical, project management, and customer teams.
