Data Engineer

Qode
San Francisco, United States of America
9 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$94K

Job location

San Francisco, United States of America

Tech stack

Java
Artificial Intelligence
Algorithmic Trading
Amazon Web Services (AWS)
Computer Vision
Azure
Computer Programming
Data Architecture
Information Engineering
Data Governance
Data Infrastructure
ETL
Data Security
Data Systems
DevOps
Disaster Recovery
Distributed Computing Environment
Distributed Systems
Fault Tolerance
Python
Cloud Services
DataOps
SQL Databases
Data Streaming
Transaction Data
Web Platforms
Data Processing
Google Cloud Platform
System Availability
Snowflake
Spark
Event Driven Architecture
Build Management
Data Lake
Low Latency
Apache Flink
Data Analytics
Kafka
Data Management
Api Design
Stream Analytics
Data Pipelines
Redshift
Databricks

Job description

This role focuses on enabling front-office, advisor, and trading operations through low-latency data pipelines, scalable architectures, and governed data platforms. You will work closely with trading desks, portfolio management, and digital platforms to deliver reliable, compliant, and high-throughput data solutions.

Key Responsibilities

Trading Data Platform Engineering
· Design and build real-time and batch data pipelines supporting trading workflows (orders, executions, positions, market data)
· Develop low-latency data processing systems for near real-time decisioning
· Build scalable data architectures for high-volume transaction data
· Enable event-driven architectures using streaming platforms (Kafka, Kinesis)

Wealth Management & Trading Integration
· Integrate with trading platforms (OMS/EMS), portfolio systems, and advisor platforms
· Support use cases such as:
  · Trade lifecycle tracking (order → execution → settlement)
  · Portfolio performance and analytics
  · Advisor dashboards and client reporting
· Ensure data consistency across front-, middle-, and back-office systems

Data Engineering & Architecture
· Build and manage data lake / lakehouse architectures (Delta Lake, Iceberg, etc.)
· Develop ETL/ELT pipelines using modern frameworks
· Design data models optimized for trading and analytics workloads
· Implement API-driven data access layers for downstream consumption

Performance, Scalability & Reliability
· Optimize pipelines for low latency, high throughput, and fault tolerance
· Implement data quality, reconciliation, and observability frameworks
· Ensure high availability and disaster recovery for critical trading data systems

Governance, Risk & Compliance
· Implement data governance, lineage, and auditability
· Ensure compliance with regulatory requirements (SEC, FINRA, etc.)
· Enable data security, entitlements, and access controls
· Support trade surveillance and reporting requirements

Collaboration & Delivery
· Partner with trading desks, product teams, and architects to translate requirements into scalable data solutions
· Work closely with AI/analytics teams to enable downstream insights and models
· Mentor junior engineers and contribute to data engineering best practices


Requirements

· 7-12+ years of experience in data engineering or backend engineering
· Strong expertise in:
  · Python / Scala / Java
  · SQL and distributed data processing (Spark, Flink, etc.)
· Hands-on experience with:
  · Streaming platforms (Kafka, Kinesis, Pulsar)
  · Data lake / warehouse technologies (Snowflake, Databricks, Redshift)
· Experience building real-time or near real-time data pipelines
· Strong understanding of data modeling and large-scale distributed systems

Preferred Qualifications

· Experience in Wealth Management or Capital Markets trading systems
· Familiarity with OMS/EMS platforms (e.g., Charles River Development, Aladdin, FIS)
· Knowledge of market data (equities, fixed income, derivatives) and trade lifecycle / post-trade processing
· Experience with cloud-native data platforms (AWS, Azure, GCP)
· Exposure to real-time analytics and risk systems

Benefits & conditions

  • $45.00 per hour


Apply for this position