Data Architect

FEDERAL EXPRESS CORP
Akron, United States of America
8 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, Spanish
Experience level
Senior
Compensation
$8,007.29 – $18,149.85 per month (see Benefits & conditions)

Job location

Remote
Akron, United States of America

Tech stack

Query Performance
Artificial Intelligence
Airflow
Google BigQuery
Cloud Engineering
Cloud Storage
Information Systems
Data Architecture
Information Engineering
ETL
Data Systems
Data Warehousing
Data Flow Control
Python
SQL Databases
Data Streaming
Google Cloud Platform
Cloud Platform System
Real Time Systems
Data Ingestion
Data Lake
Information Technology
Deployment Automation
Kafka
Data Management
Physical Data Models
Terraform
Data Pipelines
Confluent

Job description

Under limited supervision, creates project-level data architecture artifacts for specific projects.

  • Cloud-Native Architecture Design: Lead the architectural design and implementation of scalable, high-performance data platforms (Data Lakes, Data Warehouses) leveraging Google Cloud Platform (GCP) services such as BigQuery and Cloud Storage.
  • End-to-End Pipeline Engineering: Architect and optimize automated, robust data ingestion and transformation pipelines for both batch and real-time processing using Dataflow, Confluent Kafka, Pub/Sub, and Cloud Composer.
  • Advanced Data Modeling: Translate complex business requirements into high-quality conceptual, logical, and physical data models, ensuring structural integrity and alignment with modern analytics and AI/ML workloads.
  • Platform Modernization & Migration: Drive the strategic transition from legacy on-premise or hybrid data systems to cloud-native GCP solutions, defining modernization roadmaps and ensuring data consistency during re-platforming.
  • Technical Leadership & Collaboration: Partner with cross-functional engineering and product teams to provide architectural guidance, establish standard methodologies for development, and ensure technical solutions meet business performance goals.
  • Performance & Cost Optimization: Continuously monitor and refine the data platform architecture to maximize query performance and throughput while implementing strategies to optimize cloud resource consumption and operational costs.
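The batch-vs-real-time distinction in the pipeline bullet above can be sketched in plain Python. This is a minimal, hypothetical illustration: the event data, window size, and function name are invented, and a real pipeline would consume from Pub/Sub or Kafka rather than a list.

```python
# Hypothetical sketch of real-time stream processing: a tumbling-window
# count over a simulated event stream. All data here is illustrative.

from collections import Counter

events = [  # (timestamp_seconds, event_type) — stand-in for a Kafka topic
    (0, "scan"), (1, "scan"), (2, "deliver"),
    (61, "scan"), (62, "deliver"), (125, "scan"),
]

def tumbling_window_counts(events, window_s=60):
    # Group events into fixed, non-overlapping windows keyed by window start.
    windows = {}
    for ts, kind in events:
        start = (ts // window_s) * window_s
        windows.setdefault(start, Counter())[kind] += 1
    return windows

counts = tumbling_window_counts(events)
```

In a managed pipeline, Dataflow (Apache Beam) provides this windowing natively; the sketch only shows the underlying idea.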

Requirements

  • GCP Data Stack Mastery: Deep technical expertise in architecting solutions using BigQuery, Dataflow, Pub/Sub, Confluent Kafka, and Cloud Composer.
  • Advanced Data Engineering: Proficient in Dimensional, Vault, and Relational modeling and designing high-throughput ETL/ELT pipelines for complex batch and real-time streaming workloads.
  • Automation & Optimization: Strong command of SQL, Python, and Infrastructure-as-Code (Terraform) to automate deployments and optimize platform performance and cost.
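The ETL/ELT pipeline design named in the requirements above follows an extract → transform → load shape that can be sketched minimally. This is a hypothetical example — the source records, schema, and function names are invented, and the in-memory dict stands in for a warehouse table such as one in BigQuery.

```python
# Minimal, hypothetical batch ETL sketch: extract raw records, transform
# them into typed warehouse rows, and load with an idempotent upsert.
# Schema and data are illustrative, not from the posting.

from datetime import date

def extract():
    # Stand-in for reading raw files from a landing zone (e.g. Cloud Storage).
    return [
        {"order_id": 1, "amount": "19.99", "shipped": "2024-05-01"},
        {"order_id": 2, "amount": "5.00",  "shipped": "2024-05-02"},
    ]

def transform(raw):
    # Cast types and normalize columns, as a pipeline stage would.
    return [
        {
            "order_id": r["order_id"],
            "amount_usd": float(r["amount"]),
            "shipped_on": date.fromisoformat(r["shipped"]),
        }
        for r in raw
    ]

def load(rows, warehouse):
    # Upsert keyed on order_id, so re-running the batch is safe (idempotent).
    for row in rows:
        warehouse[row["order_id"]] = row
    return warehouse

warehouse = load(transform(extract()), {})
```

In an ELT variant, the transform step would instead be SQL executed inside the warehouse after a raw load.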

Minimum Education

Bachelor's degree in computer science, engineering, information systems, and/or equivalent formal training.

Minimum Experience

Five (5) years of equivalent work experience in an information technology or engineering environment.

This position can be domiciled anywhere in the United States. The ability to work remotely within the United States may be available based on business need.

Right to Work Notice (English) / (Spanish)

Benefits & conditions

Pay (monthly, by location):

  • USA: $8,007.29 – $18,149.85
  • CO: $8,007.29 – $17,393.61
  • CA: $8,452.14 – $14,413.11
  • NJ: $8,452.14 – $13,523.42
  • OH & VT: $8,452.14 – $14,368.63
  • MN: $8,452.14 – $16,637.36
  • IL & NV: $8,452.14 – $17,393.61
