C# .NET Developer | High-Volume Data
Job description
- Design and build high-performance backend systems using C# and .NET Core
- Develop scalable, event-driven services within a Kafka streaming architecture
- Engineer systems capable of handling large volumes of structured and time-series data
- Write efficient, multithreaded C# code optimised for throughput and performance
- Contribute to system architecture, scalability, and resilience decisions
- Deliver end-to-end functionality from analysis and design through to release
- Apply clean coding practices (SOLID, IoC, automated testing)
- Support CI/CD pipelines, Azure deployments, and production monitoring
- Collaborate closely with Trading and Quant teams on performance-sensitive solutions
- Mentor colleagues and review code where required
Requirements
- Strong commercial experience as a C# .NET Developer
- Deep understanding of .NET Core / .NET 5+ backend development
- Experience building high-throughput backend services or APIs
- Exposure to distributed systems, concurrency, or multithreading
- Experience working with relational databases (SQL Server, Postgres, etc.)
- Exposure to Azure cloud environments
- Strong understanding of object-oriented design and modern engineering practices
- Confident communicator comfortable working across technical and business teams
Desirable Experience
- Kafka or other event-streaming technologies
- Large-scale data environments (streaming, data lakes, time-series systems)
- Docker & Kubernetes
- Observability and monitoring tools
- Experience within energy, commodities, or financial trading environments
Overview
This is a C# .NET-focused engineering role working in a high-volume, high-frequency data environment, building modern distributed systems that process and stream large-scale market data in real time.
You will be joining a technology-driven trading business that invests heavily in modern engineering practices, scalable architecture, and cloud-native infrastructure.
The Team
The Data Platform team sits at the core of the organisation's trading ecosystem, designing and building the systems responsible for:
- Ingesting and streaming high-frequency market data
- Storing and processing large-scale time-series datasets
- Distributing real-time data across trading and analytics platforms
- Enabling advanced quantitative and trading strategies
The environment is event-driven, Kafka-based, and built on modern .NET Core and Azure infrastructure. The team works closely with Trading, Quantitative Analytics, and Production Operations to deliver performance-critical systems that directly affect trading outcomes.
Role Details
Up to £95,000 + 20-30% Bonus + Benefits
Tech Stack: C# | .NET Core (5+) | Kafka | Azure | SQL Server | Docker | Kubernetes
Hybrid working (2-3 days per week onsite)