AI Engineer

Insight Global
Woonsocket, United States of America
yesterday

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
$114K

Job location

Woonsocket, United States of America

Tech stack

Artificial Intelligence
Databases
Information Engineering
Data Integration
DevOps
Systems Monitoring
Python
PostgreSQL
MongoDB
Node.js
Data Streaming
Unstructured Data
Google Cloud Platform
React
Large Language Models
Spring Boot
Generative AI
Backend
AI Platforms
Kafka
Data Pipelines
Apache Beam

Job description

We are looking for a talented AI Engineer to join our team. The AI Engineer will be responsible for designing and implementing end-to-end AI solutions, with a strong focus on Retrieval-Augmented Generation (RAG) use cases that leverage enterprise data at scale. This role involves building and integrating data pipelines that ingest, ground, and store structured and unstructured data within databases, enabling large language models to ground their responses in and reason over that data. The engineer will work across integration, data engineering, and AI development to ensure seamless data flow, high-quality grounding, and scalable, production-ready AI solutions that support business-critical pharmacy operations.
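For candidates unfamiliar with the pattern, the retrieve-then-ground flow described above can be sketched in a few lines. This is a toy illustration only: it uses a bag-of-words similarity in place of a real embedding model and vector store (a production pipeline would use something like Vertex AI embeddings with a managed vector database), and it stops at prompt assembly rather than calling an LLM.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline would call an
    # embedding model and store vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Ground the model by injecting retrieved context into the prompt.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The key design point the role calls out — grounding the model in enterprise data rather than relying on its parametric knowledge — lives entirely in the retrieval step and the prompt template.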

Requirements

  • Strong proficiency with AI agents and large language models, including hands-on experience deploying AI-driven solutions

  • Demonstrated experience implementing Retrieval-Augmented Generation (RAG) solutions at scale

  • Proficiency in backend development using Python as well as Spring Boot and/or Node.js

  • Experience building user interfaces using React

  • Solid DevOps experience, including clean deployment of services across QA, admin, and production environments

  • Experience with monitoring and observability to ensure reliability, performance, and operational health of AI services

  • Strong data engineering skills, with the ability to move, transform, and integrate data across multiple systems and databases

  • Experience working with databases such as MongoDB, PostgreSQL, and Spanner

  • Hands-on experience with Google Cloud Platform (GCP), including use of GCP Dataflow for data pipelines, Cloud Functions, and other GCP-native services

  • Experience leveraging Kafka for event streaming and data integration

  • Familiarity with Gemini or other large language models for production AI use cases

Benefits & conditions

$40-55/hr

Exact compensation may vary based on several factors, including skills, experience, and education.

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
