Senior Cloud Data Architect

Boeing Company
Seattle, United States of America
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$230K

Job location

Seattle, United States of America

Tech stack

Agile Methodologies
Airflow
Amazon Web Services (AWS)
Architectural Patterns
JIRA
CA Workload Automation AE
Azure
Big Data
Software as a Service
Cloud Computing
Cloud Database
Information Systems
Continuous Delivery
Data Architecture
Information Engineering
Data Governance
Data Integration
ETL
Data Security
IBM InfoSphere DataStage
DevOps
GitHub
Hadoop
Identity and Access Management
Metadata
Metadata Management
Performance Tuning
Systems Development Life Cycle
Role-Based Access Control
Prometheus
Data Streaming
Unstructured Data
Workflow Management Systems
Datadog
Spark
GitLab
Information Technology
Kafka
Data Management
Data Lakehouse
Terraform
Data Pipelines
DevSecOps
Jenkins
Databricks

Job description

Boeing has a current need for a Senior Cloud Data Architect to deliver enterprise data pipelines and platform components on AWS and Databricks. This is a hands-on contributor role that drives implementation, performance tuning, and mentoring, and leads the modernization of legacy ETL pipelines to a scalable, configuration-driven ETL framework running on AWS. You will lead a large-scale ETL modernization initiative migrating legacy pipelines (e.g., DataStage, GoldenGate, HVR) to a scalable, configuration-driven, metadata-based ETL framework, ensuring adherence to data governance, security, and compliance standards.

  • Lead the implementation of a metadata-driven, reusable ETL framework on the AWS cloud data platform and champion repeatable, self-service cloud and data architecture patterns that enable teams to autonomously deploy scalable, high-performance, maintainable, and compliant data pipelines across the enterprise.
  • Lead end-to-end data integration and ETL/ELT processes to ingest, transform, and deliver complex structured and unstructured data into a governed Data Lakehouse, enabling seamless access for analytics, reporting, and data science workloads.
  • Design and deliver cloud-native, cloud-agnostic data platforms and data engineering solutions on AWS, leveraging SaaS products such as Databricks to ensure portability, resilience, and consistent governance across environments.
  • Drive automation, DevOps/DevSecOps, and Infrastructure as Code (IaC) initiatives to deliver repeatable, testable, and deployable artifacts and accelerate migrations.
  • Troubleshoot and resolve implementation issues throughout the SDLC; monitor architecture compliance and operational health.
  • Design and configure data pipelines with enterprise orchestration and scheduling tools, and establish monitoring, alerting, and operational runbooks for production support.
  • Provide technical leadership, mentorship, and guidance to ETL engineering teams; establish coding best practices, enable the team with automation strategies and tools, conduct peer reviews, and drive knowledge sharing across distributed teams.
  • Build and maintain strong relationships with vendors, partners, and cross-functional teams; own stakeholder communications and collaboration channels; and drive accountability and organizational change through regular updates to product managers, DBAs, architects, and senior leadership.
  • Operationalize and standardize cloud platforms (AWS/Azure), applying architecture patterns, guardrails, and enterprise standards for scalability, reliability, security, compliance, and cost control.

Requirements

  • Bachelor's Degree or higher in Computer Science, Engineering, Information Systems, or equivalent practical experience.
  • Demonstrated ability to lead technical initiatives, mentor peers, and communicate effectively across distributed teams.
  • 5+ years' experience with ETL tools and patterns (e.g., DataStage, Informatica) and building repeatable ETL/ELT pipelines.
  • 5+ years' hands-on experience building large-scale big data applications using Databricks / Apache Spark; familiarity with Hadoop and Kafka is a plus; demonstrable production performance tuning experience.
  • 3+ years of experience designing and implementing metadata-driven, pattern-based ETL/ELT frameworks.
  • 3+ years working with AWS data services and core managed services (S3, VPC, IAM, KMS, Secrets Manager, EC2) and cloud data lake/warehouse concepts.
  • 3+ years' experience implementing CI/CD and DevOps practices for data workloads (GitHub/GitLab, Terraform, Jenkins, or equivalent).
  • 3+ years' experience with orchestration tools (Airflow, AutoSys, Databricks Workflows).
  • Hands-on experience with ingestion patterns: batch, streaming, and CDC.
  • Strong skills in performance tuning and optimization of new and migrated data pipelines.

Preferred Skills (Nice To Have)

  • 5+ years' exposure to data security, governance, and compliance practices (encryption, RBAC, metadata management); familiarity with FedRAMP, NIST, and GDPR.
  • Experience migrating medium-to-large pipelines to the cloud; please note the scale where possible (e.g., TBs/day, number of pipelines).
  • Familiarity with observability and lineage tooling (Datadog, Prometheus, OpenLineage, Unity Catalog, etc.).
  • Experience with Agile software development lifecycle and tooling (ADO, JIRA).

Typical Education/Experience:

  • Education/experience typically acquired through advanced education (e.g., Associate) and typically 2 or more years' related work experience, or an equivalent combination of education and experience (e.g., Bachelor's + 1 year's related work experience, 5 years' related work experience, etc.).
  • This position must meet U.S. export control compliance requirements. To meet U.S. export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.62 is required. "U.S. Person" includes U.S. Citizen, U.S. National, lawful permanent resident, refugee, or asylee.
  • Bachelor's Degree or Equivalent Required

Benefits & conditions

At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.

The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and several programs that provide for both paid and unpaid time away from work.

The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.

Pay is based upon candidate experience and qualifications, as well as market and business considerations.

Pay Range is dependent on geographical location and experience:

Senior (level 5) - $182,700 - $230,000

Senior (level 6) - $218,700 - $280,000

About the company

At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
