AI Data Engineer - GCP

Eliassen Group
Lincoln, United States of America
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$125K

Job location

Remote
Lincoln, United States of America

Tech stack

API
Artificial Intelligence
Amazon Web Services (AWS)
JIRA
Unit Testing
Google BigQuery
Cloud Computing
Cloud Database
Data Validation
Information Engineering
ETL
Graph Database
Python
SQL Databases
Workflow Management Systems
Enterprise Software Applications
Large Language Models
Multi-Agent Systems
Prompt Engineering
Data Pipelines
Jenkins

Job description

  • Build and operationalize LLM-based multi-agent workflows that read Jira stories to interpret requirements; generate SQL, Python, and Dataform-ready code; perform automated data validation and unit testing; and prepare deployment artifacts and raise pull requests.
  • Apply agent patterns such as planners, evaluators, and tool-using agents to automate engineering tasks.
  • Integrate LLM agents with BigQuery, Python, ETL/ELT pipelines, and internal frameworks.
  • Automate repetitive development and testing steps to reduce manual engineering work.
  • Collaborate with teams exploring VS Code Copilot, Auto Model, Cloud Model Sonnet, Gemini MCP, and Claude Code.
  • Support multiple Orion squads including SKU, Customer, Fulfillment, Stores, DCs, and Website, and contribute to future program waves.
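The agent patterns named above (planner, evaluator, tool-using agents) can be sketched as a simple loop. This is an illustrative outline only, not the client's implementation: the model and tool calls are stubbed, and all function and class names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    story: str                                 # e.g. a Jira story summary
    artifacts: list[str] = field(default_factory=list)

def planner(task: Task) -> list[str]:
    # A real planner would prompt an LLM to decompose the story into steps.
    return ["generate_sql", "write_unit_tests", "validate_data"]

def executor(step: str, task: Task) -> str:
    # A real executor would invoke an LLM or a tool (BigQuery, pytest, ...)
    # and capture the produced artifact; here we just record a placeholder.
    artifact = f"{step}:{task.story}"
    task.artifacts.append(artifact)
    return artifact

def evaluator(artifact: str) -> bool:
    # A real evaluator would run tests or ask a critic model to review.
    return artifact != ""

def run_workflow(task: Task) -> Task:
    # Planner decomposes; executor acts; evaluator gates each step.
    for step in planner(task):
        artifact = executor(step, task)
        if not evaluator(artifact):
            raise RuntimeError(f"step {step} failed review")
    return task

if __name__ == "__main__":
    done = run_workflow(Task(story="ORION-123: add SKU dedup pipeline"))
    print(done.artifacts)
```

In practice the evaluator's pass/fail signal is what makes the pipeline safe to automate end to end, from requirement interpretation through to raising a pull request.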

Requirements

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.

  • 5+ years total engineering experience.

  • Strong Python development background.
  • Hands-on experience with LLM agents, multi-agent frameworks, or AI orchestration patterns.
  • Strong understanding of prompt engineering, context management, and agent-based task decomposition.
  • Solid data engineering experience across ETL/ELT pipelines, data modeling, orchestration tools, and cloud data ecosystems.
  • Experience with GCP and BigQuery, or AWS with some BigQuery exposure.
  • Ability to integrate LLM agents with data pipelines, APIs, and enterprise systems.

Preferred Skills:

  • Exposure to VS Code Copilot
  • Exposure to Auto Model
  • Exposure to Cloud Model Sonnet
  • Exposure to Gemini MCP
  • Exposure to Claude Code
  • Familiarity with knowledge graph concepts
  • Experience with Dataform or Jenkins-based orchestration

Benefits & conditions

Skills, experience, and other compensable factors will be considered when determining pay rate. The pay range provided in this posting reflects a W2 hourly rate; other employment options may be available that may result in pay outside of the provided range.

W2 employees of Eliassen Group who are regularly scheduled to work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), dental, vision, pre-tax accounts, other voluntary benefits including life and disability insurance, 401(k) with match, and sick time if required by law in the worked-in state/locality.

If anyone reaches out to you about an open position connected with Eliassen Group, please ensure that you are working directly with us by confirming the following:

  • When you work with Eliassen Group, all email communication will come from an Eliassen.com address, never Gmail, Yahoo, etc.

About the company

Eliassen Group is a strategic consulting firm that helps organizations reach further and achieve more through our technology, business advisory, and life sciences solutions. For nearly 40 years, we have combined exceptional people, deep domain expertise, and intelligent capabilities to expand our clients' capacity and accelerate meaningful outcomes. We are driven by a purpose to positively impact the lives of our employees, clients, consultants, and the communities we serve.

Apply for this position