Data Engineer

One Resources
Sheffield, United Kingdom
9 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£135K

Job location

Sheffield, United Kingdom

Tech stack

Batch Processing
Serialization
Protocol Buffers
Hadoop
JSON
Python
Performance Tuning
Software Engineering
Data Streaming
Parquet
Spark
Build Management
Avro
Kafka

Job description

We have an exciting opportunity with one of our tier-one banking clients! They are currently looking for a skilled Data Engineer, a Kafka and Hadoop expert (Python), to join their team for a seven-month contract.

Job Responsibilities/Objectives:

  • Design and build Kafka-based streaming applications (Kafka Streams/ksqlDB) in Scala/Python for transformation, enrichment, and routing.
  • Implement end-to-end streaming pipelines: producers, stream processors, and consumers with strong data quality, idempotency, and DLQ patterns.
  • Model topics, schemas, and contracts (Avro/Protobuf/JSON) and maintain backward/forward compatibility.
  • Develop batch/stream interoperability: Spark/Structured Streaming jobs for aggregation, feature generation, and storage in Parquet/ORC.
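As a rough illustration of the idempotency and dead-letter-queue (DLQ) patterns mentioned above, here is a minimal, broker-free Python sketch. Topic names, record shapes, and the in-memory "topics" are hypothetical; a real pipeline would use a Kafka client library and real topics:

```python
import json

def process(record: dict) -> dict:
    """Hypothetical enrichment step; raises KeyError on malformed input."""
    return {**record, "amount_gbp": record["amount"] * 1.0}

def consume(records, seen_keys, out_topic, dlq_topic):
    """Idempotent consumer: skip duplicate keys, route failures to a DLQ."""
    for raw in records:
        try:
            record = json.loads(raw)
            key = record["id"]
            if key in seen_keys:        # idempotency: drop replayed records
                continue
            out_topic.append(process(record))
            seen_keys.add(key)
        except (json.JSONDecodeError, KeyError) as exc:
            # DLQ pattern: keep the bad payload and the reason for triage
            dlq_topic.append({"payload": raw, "error": str(exc)})

# Simulated topics: a duplicate record and a malformed one
out, dlq, seen = [], [], set()
consume(
    ['{"id": 1, "amount": 10}', '{"id": 1, "amount": 10}', 'not-json'],
    seen, out, dlq,
)
# out holds one enriched record; dlq holds the malformed payload
```

The same shape carries over to a real consumer loop: the deduplication set becomes a keyed state store, and the DLQ list becomes a dedicated error topic.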

Requirements

  • Kafka application development: Kafka Streams/ksqlDB, producer/consumer patterns, partitioning/serialization, exactly-once/at-least-once semantics.
  • Languages: Strong in Scala and/or Python for streaming apps; familiarity with testing frameworks and CI for stream processors.
  • Schema management: Avro/Protobuf/JSON, schema registry usage, compatibility strategies.
  • Stream/batch processing: Spark (including Structured Streaming), Parquet/ORC, partitioning/bucketing, performance tuning.
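To give a flavour of the schema-compatibility strategies listed above, the sketch below mimics Avro-style backward compatibility in plain Python: a v2 reader supplies defaults for fields absent from records written under the older schema. The field names and defaults are illustrative only, not taken from the posting:

```python
# Hypothetical "payment" schema. v2 adds an optional field with a default,
# which keeps it backward compatible: a v2 reader can still decode records
# written by a v1 producer.
SCHEMA_V2_DEFAULTS = {"currency": "GBP"}  # new-in-v2 fields and their defaults

def read_v2(record: dict) -> dict:
    """Decode a record under schema v2, filling defaults for missing fields."""
    return {**SCHEMA_V2_DEFAULTS, **record}

old_record = {"id": 42, "amount": 10}                       # v1 producer
new_record = {"id": 43, "amount": 5, "currency": "EUR"}     # v2 producer

assert read_v2(old_record)["currency"] == "GBP"   # default applied
assert read_v2(new_record)["currency"] == "EUR"   # explicit value wins
```

In a real deployment this rule is enforced centrally: a schema registry rejects a new schema version unless it satisfies the configured compatibility mode (e.g. backward, forward, or full).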
