Data Engineer

Infinity Tech Group Inc
Princeton, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Princeton, United States of America

Tech stack

Java
Google BigQuery
Code Review
Continuous Integration
Information Engineering
Data Security
Data Warehousing
Database Queries
Fraud Prevention and Detection
Python
Machine Learning
Performance Tuning
Role-Based Access Control
Data Streaming
Snowflake
Spark
Data Lake
Debezium
Information Technology
Kafka
Machine Learning Operations
Stream Processing
Data Pipelines
Redshift
Databricks

Job description

We're looking for a Data Engineer to design, build, and maintain reliable data pipelines and platforms that power analytics, reporting, and data products across a Property & Casualty (P&C) auto insurance business. You'll partner with actuarial, underwriting, claims, product, and finance teams to deliver trusted, well-governed data that supports retention, loss modeling, fraud detection, and customer experience.

  • Build and maintain scalable batch and streaming data pipelines (ingestion, transformation, validation, and delivery).
  • Integrate data from core insurance systems (policy administration, billing, claims, contact management).
  • Model and curate analytical datasets (e.g., policy, quote, exposure, premium, loss, claim, payment, reserve, subrogation, salvage).
  • Implement data quality checks, anomaly detection, and reconciliation for critical metrics.
  • Develop and maintain a data lake/warehouse (schema design, partitioning, performance tuning, cost optimization).
  • Collaborate with governance, security, and compliance teams to implement access controls, PII handling, retention, and auditability.
  • Operate data workflows in production: monitoring, alerting, incident response, SLAs, and root-cause analysis.
  • Contribute to engineering best practices: CI/CD, infrastructure-as-code, testing, code reviews, and documentation.
  • Perform other duties as assigned.

Requirements

  • Bachelor's degree in computer science or a related technology field is preferred.
  • 5+ years of experience in data engineering or machine learning, with strong SQL skills and experience building dimensional or analytical models.
  • Proficiency in at least one programming language: Python, Scala, or Java.
  • Experience with modern data warehouses/lakehouses (e.g., Snowflake, BigQuery, Redshift, Databricks, Spark).
  • Familiarity with data quality/testing (e.g., dbt tests, Great Expectations) and observability/monitoring patterns.
  • Understanding of data security fundamentals (PII, encryption, role-based access control).

Preferred

  • Experience in insurance, financial services, or other regulated industries.
  • Knowledge of P&C auto insurance concepts (underwriting, rating/pricing, claims lifecycle, reserves, exposure, and reinsurance) is a plus.
  • Experience with event/streaming systems (Kafka, Kinesis, Pub/Sub) and CDC tools (Debezium, Fivetran/HVR, etc.).
  • Exposure to MLOps/data needs for fraud detection, pricing, or claims severity models is a plus.

Benefits & conditions

  • Comprehensive health benefits including medical, dental and vision coverage
  • Generous paid time off (PTO days, sick days, and holidays)
  • Flexible spending options with FSA & HSA plans
  • Life and AD&D insurance
  • 401(k) with company match