Senior Data Engineer
Job description
Close collaboration with portfolio managers will be required.

Responsibilities:
- Build data pipelines on the AWS platform using existing tools such as cron, Glue, EventBridge, Python-based ETL, and Amazon Redshift
- Work closely with data vendors such as Bloomberg, Refinitiv, exchanges, SpiderRock, and SocGen
- Normalize and standardize vendor data and firm data for firm-wide consumption
- Help support and expand platform capabilities, from basic daily/historical processing to products and private data storage
- Coordinate with internal teams on delivery, access, requests, and support
- Promote data engineering best practices and mentor junior team members
- Conduct architectural and design reviews, establish best practices for data engineering, mentor and guide data engineers, and contribute to hiring and technical development of the global team

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
- 10+ years of experience in a similar role
- 5+ years of hands-on Java development experience (Java 11 or higher preferred)
- Prior buy-side experience strongly preferred (multi-strategy/hedge funds)
- Capital markets experience is necessary, with good working knowledge of reference data across asset classes and experience with trading systems
- Prior experience with low-latency ingestion pipelines and with in-memory and persistent storage layers
- Deep AWS cloud experience with common services such as S3, Lambda, cron, EventBridge, etc.
- Proven experience designing and deploying data infrastructure at scale, preferably at a financial services firm or hedge fund
- Experience designing and deploying disaster recovery and high-availability strategies in cloud environments
- Strong hands-on skills with NoSQL and SQL databases, programming in Python, and data pipeline and analytics tools
- Familiarity with time-series data and common market data sources (Bloomberg, Refinitiv, etc.)
- Familiarity with modern DevOps practices and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
- Familiarity with fast-moving data is preferred
- Exposure to Chronicle libraries, Reactive Streams, or the Disruptor pattern is a plus
- Expert knowledge of Java concurrency, NIO, JVM tuning, and lock-free data structures
- Excellent communication skills to work with stakeholders across technology, investment, and operations teams
- Ability to work in a fast-paced environment with tight deadlines
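To illustrate the "normalizing/standardizing vendor data" responsibility, here is a minimal Python sketch of mapping vendor-specific records onto a common firm schema. The vendor field names and schema keys below are illustrative assumptions for this posting, not the firm's actual data model:

```python
# Hypothetical sketch: map vendor-specific field names onto a common
# firm schema. Vendor field names and schema keys are assumptions,
# not the firm's actual data model.

FIELD_MAPS = {
    "bloomberg": {"ID_ISIN": "isin", "PX_LAST": "last_price", "CRNCY": "currency"},
    "refinitiv": {"Isin": "isin", "TRDPRC_1": "last_price", "Currency": "currency"},
}

def normalize(vendor: str, record: dict) -> dict:
    """Translate one vendor record into the common schema, tagging its source."""
    field_map = FIELD_MAPS[vendor]
    out = {dst: record[src] for src, dst in field_map.items() if src in record}
    out["source"] = vendor
    return out
```

In practice a step like this would typically run inside a Glue or Lambda job triggered by EventBridge, writing the normalized output to Redshift or S3 for firm-wide consumption.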