Data Engineer

Implicity
Paris, France
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, French
Experience level
Intermediate
Compensation
€ 60K

Job location

Remote
Paris, France

Tech stack

Java
Agile Methodologies
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Data analysis
Apache Iceberg
Azure
Google BigQuery
Software as a Service
Software Quality
Data Architecture
Information Engineering
Data Infrastructure
Data Integrity
ETL
Data Warehousing
Cursor (AI code editor)
DevOps
Hadoop
Monitoring of Systems
Python
PostgreSQL
Online Analytical Processing
Scrum
RabbitMQ
SQL Databases
Data Streaming
TypeScript
Data Processing
Cloud Platform System
Data Ingestion
Fast Healthcare Interoperability Resources
Snowflake
Spark
Technical Debt
GitLab CI
Kubernetes
Information Technology
Apache Flink
Performance Monitor
Terraform
Apache Beam
Docker

Job description

  • Build and maintain scalable data ingestion pipelines and ETL/ELT processes (from staging to production).
  • Contribute to the evolution of our data architecture to improve performance, scalability, and reliability.
  • Partner with analytics engineers, data scientists, and business teams to understand and implement data requirements.
  • Support and optimize cloud-based data infrastructure (AWS).
  • Deploy automated data quality checks and monitoring systems to ensure data reliability.
  • Develop and keep up-to-date technical documentation for data processes and systems.
  • Investigate and resolve data-related issues to guarantee data integrity across the stack.

The following is our current stack; we don't expect you to be an expert in every single tool (a short pipeline sketch follows at the end of this section).
  • Languages: Python, Java, TypeScript
  • Data Processing: Spark (AWS Glue), Apache Beam (or Apache Flink)
  • Storage & Table Format: PostgreSQL, Apache Iceberg, S3
  • Integration & Messaging: RabbitMQ
  • Infrastructure & DevOps: AWS, Terraform (IaC), Docker, Kubernetes, GitLab CI
  • Analytics & OLAP: dbt, Cube, Metabase, Athena
  • Methodology: Agile (Scrum), Lean management

Why join us:

  • Work that matters: We provide solutions that directly help doctors improve patient care.
  • High-growth stage: With 100+ people, we are at the perfect size: large enough to have structure, but small enough for your individual impact to be felt every day.
  • Structure for Autonomy: We encourage proactivity but provide the right support through weekly 1:1s and clear quarterly OKRs to help you move in the right direction.
  • Innovation Mindset: We are all focused on constantly improving our product and processes. You will use innovative AI-powered tools to stay ahead and work smarter.
  • Our Values: Our culture is built on Integrity (acting with fairness), Ambition (striving for excellence in healthcare), and Cooperation (supporting each other for collective success).
  • Balance & Culture: We offer a respectful, remote-friendly environment with regular team events to keep us connected.
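
As a rough illustration of the staging-to-production step described in the role above, a minimal batch job with Apache Beam's Python SDK could look like the sketch below; the S3 paths, the field names, and the quality rule are hypothetical, and reading from S3 assumes the apache-beam[aws] extra is installed.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def is_valid(record):
        # Basic quality gate: keep only records that carry an identifier and a timestamp.
        return bool(record.get("patient_id")) and bool(record.get("recorded_at"))


    def run():
        # Runner, region and temp locations are supplied on the command line.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadStaging" >> beam.io.ReadFromText("s3://staging-bucket/events/*.json")  # hypothetical path
                | "Parse" >> beam.Map(json.loads)
                | "QualityFilter" >> beam.Filter(is_valid)
                | "Serialize" >> beam.Map(json.dumps)
                | "WriteProduction" >> beam.io.WriteToText("s3://prod-bucket/events/clean")  # hypothetical path
            )


    if __name__ == "__main__":
        run()

In a real deployment the same shape would run on a managed runner, and rejected records would feed a monitoring sink rather than being silently dropped.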

Requirements

Screening questions: Do you have experience in TypeScript? Do you have a Master's degree?

  • Seniority & Experience: Intermediate level with 3 to 5 years of hands-on experience in data engineering.
  • Sector Experience (Bonus): Prior experience in the Tech or SaaS sector is a plus.
  • Education: Master's or engineering degree in Computer Science, Engineering, or a related field.
  • Languages: Fluent in English and French.

Hard Skills:

  • Core Engineering: Solid SQL & Modeling skills (efficient schemas) and hands-on experience with Python or Java.
  • Cloud Platform: Experience with AWS, GCP, or Azure is required.
  • Data Processing: Proven experience building robust ETL/ELT pipelines (GB/TB scale) with batch / streaming frameworks (e.g. Spark, Apache Beam, Flink, Hadoop), with automated quality checks and proactive monitoring.
  • Orchestration: Experience with Dagster or Airflow (or equivalent); a minimal DAG sketch follows this list.
  • Lakehouse / Data Warehouse (Bonus): Experience with Snowflake, BigQuery, or Apache Iceberg / S3 is a plus.
  • Engineering Standards: Ability to apply best practices to build maintainable pipelines while balancing technical debt with feature delivery (balancing speed and code quality).
  • AI usage: We value engineers who use AI-assisted tools (Cursor, Claude, Copilot).
  • Health & Privacy (Bonus): Interest in healthcare data and familiarity with GDPR/HDS. Previous exposure to FHIR is a plus.
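
To make the orchestration and quality-check points above concrete, here is a minimal sketch with Airflow; the DAG id, the schedule, and the hard-coded row count (standing in for a real warehouse query) are hypothetical, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval).

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def check_daily_ingest(**_):
        # In practice this would run a COUNT(*) against the warehouse (e.g. Athena
        # or PostgreSQL); a hard-coded value keeps the sketch self-contained.
        ingested_rows = 1_250  # hypothetical result for yesterday's partition
        if ingested_rows == 0:
            # Raising makes the task fail, so the problem surfaces through
            # Airflow's normal alerting instead of propagating bad data downstream.
            raise ValueError("No rows ingested for yesterday's partition")


    with DAG(
        dag_id="daily_ingest_quality_check",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="row_count_check",
            python_callable=check_daily_ingest,
        )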

Mindset and Soft Skills:

  • Pragmatic & Focused: Able to work in fast-paced environments with a focus on delivering value.
  • Autonomous & Self-driven: Comfortable working independently while contributing effectively to cross-functional teams.
  • Curious & Adaptable: Eager to learn and develop expertise in emerging data technologies.

Benefits & conditions

  • For this permanent position (CDI), the base salary is between €55k and €60k, depending on your experience.
  • Eligible for stock options (BSPCE) according to the company's existing rules.

About the company

Implicity is a digital MedTech company that brings outstanding innovations to cardiologists thanks to Big Data and Artificial Intelligence. With our leading cardiac remote monitoring platform, it is far easier to manage data and predict patient issues, so that cardiologists can deliver the best care at the best time. To put it simply, when you join Implicity, you will contribute to saving lives with us.

Dr Arnaud Rosier (cardiologist and AI researcher) and David Perlmutter (engineer and entrepreneur) co-founded Implicity in 2016.

  • 10+ years later, this French start-up / scale-up is a real game changer in the healthcare market, literally shaping the future of cardiology.
  • 250+ hospitals and medical centers are already using our solutions, covering 100,000+ patients.

At Implicity, you will find the greatest experts in data science, engineering, clinical, regulatory, IT, sales, customer success, and more, working together. This amazing team has already made Implicity a clear European leader, and we will very soon do the same in the US market.

Apply for this position