Senior Data & Platform Engineer
CryptoNext Security
9 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English, French
Experience level: Senior
Job location
Tech stack
API
Amazon Web Services (AWS)
Software Quality
Databases
Information Engineering
Data Governance
ETL
Data Retention
Data Security
Data Structures
Elasticsearch
PostgreSQL
Performance Tuning
Prometheus
Data Streaming
Grafana
Backend
Kafka
Stream Processing
ELK
Microservices
Job description
- Define and maintain the end-to-end data model for COMPASS (entities, relationships, vocabulary, lifecycle).
- Translate product requirements into robust database schemas, indexing strategies, and normalization/denormalization choices.
- Ensure semantic consistency and data governance rules across all services.
- Design, validate, and maintain ingestion pipelines using Kafka, ensuring data correctness and consistency.
- Define the canonical formats exchanged across Kafka (ingestion), PostgreSQL (storage), and Elasticsearch (search).
- Validate and standardize data structures produced by probes, sensors, and backend services.
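By way of illustration only (this sketch is not part of the posting): the "canonical formats" and "contract-based schemas" described above typically mean that every event is validated against an explicit schema before it crosses a boundary such as Kafka. A minimal Python sketch, with all type and field names hypothetical:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical canonical event exchanged between probes, Kafka, and the
# PostgreSQL / Elasticsearch sinks. All field names are illustrative.
@dataclass(frozen=True)
class ProbeEvent:
    event_id: str
    source: str        # producer kind, e.g. "probe", "sensor", "backend"
    algorithm: str     # cryptographic algorithm observed on the wire
    observed_at: str   # ISO-8601 timestamp

    def validate(self) -> None:
        """Reject events that break the contract before they enter Kafka."""
        if not self.event_id:
            raise ValueError("event_id must be non-empty")
        if self.source not in {"probe", "sensor", "backend"}:
            raise ValueError(f"unknown source: {self.source}")

    def to_message(self) -> bytes:
        """Serialize to a canonical JSON wire format (sorted keys)."""
        self.validate()
        return json.dumps(asdict(self), sort_keys=True).encode("utf-8")

event = ProbeEvent("evt-001", "probe", "RSA-2048", "2024-01-01T00:00:00Z")
payload = event.to_message()
```

In production this validation would usually live behind a schema registry (e.g. Avro or JSON Schema) rather than hand-written checks, but the principle is the same: the schema, not the consumer, defines what a valid event is.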
Backend Collaboration & API Contracts
- Work closely with backend teams to design API-level data contracts aligned with the data model.
- Optimize queries, indexes, and persistence strategies for real-time workloads and analytics.
You will work closely with:
- The R&D engineering team building CryptoNext's cryptographic and observability products.
- Backend engineers designing APIs and distributed services.
- The Head of Engineering and CTO for architecture, strategy, and long-term product evolution.
- Product and QA teams to ensure data quality, reliability, and consistency across the platform.
You will be the reference person for all data-related decisions and architecture within the COMPASS squad.
Hiring process
- Technical interview with a senior engineer and the Head of Engineering (Kafka, PostgreSQL, ELK, data modeling).
- CTO meeting: a deep dive on architecture, the COMPASS roadmap, and cultural fit.
- Offer & onboarding: welcome to CryptoNext!
Requirements
- 15+ years of experience in software or data engineering in a distributed or complex environment.
- Master's degree from a French engineering school or European equivalent.
- Deep expertise in:
- Data modeling & relational schema design
- PostgreSQL administration & performance optimization
- Kafka (setup, topics/partitions, consumer groups, schema registry, monitoring)
- Elasticsearch / ELK stack
- Strong hands-on experience with:
- Stream processing architectures
- ETL/ELT workflows
- Data lifecycle management
- Proven ability to design data flows with clean, contract-based schemas.
- Strong commitment to code quality, documentation, and operational reliability.
- Fluency in English.
Nice to have
- Experience with ClickHouse, Grafana, or Prometheus.
- Understanding of secure data handling and cryptographic components.
- Familiarity with AWS/GCP or hybrid on-prem architectures.
- Background in cybersecurity, observability, or high-throughput systems.
- Experience with large-scale data retention strategies and analytics workloads.
About the company
CryptoNext Security is a deeptech startup specializing in post-quantum cryptography. We help businesses and institutions prepare for a world where quantum computers will challenge today's security standards.
Our clients include major financial institutions, defense actors, telecom operators, and global enterprises seeking to future-proof their cryptography.
What's in it for you at CryptoNext
* Join a deeptech company at the forefront of post-quantum cybersecurity.
* Build the core data infrastructure of a cutting-edge observability platform.
* Collaborate with world-class cryptographers and software engineers.
* Flexible hybrid/remote policy and office in central Paris.
* Competitive compensation aligned with seniority and expertise.
* Direct impact on a product used by major financial and defense clients.
* A chance to shape the future of quantum-safe data engineering.