Backend AI Engineer

Lever, Inc.
Cleveland, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Cleveland, United States of America

Tech stack

API
Artificial Intelligence
Amazon Web Services (AWS)
Data analysis
Google BigQuery
ETL
Distributed Systems
Python
Online Analytical Processing
Parsing
Software Engineering
Data Streaming
Unstructured Data
Large Language Models
Snowflake
Spark
Backend
Core Data
Information Technology
Apache Flink
Real Time Data
Kafka
Microservices

Job description

Role Overview

Move beyond traditional ETL. At BrightEdge, we are rebuilding our core data infrastructure into an AI-augmented processing engine. We're looking for a Senior Backend Engineer to architect "intelligent" systems that don't just move data, but understand it. You will build self-healing pipelines and agentic workflows that allow our platform to autonomously adapt to complex, shifting data landscapes.

What You'll Do

Architect Agentic Pipelines: Build Python-based backend systems that utilize LLMs and AI agents to automate the discovery, parsing, and mapping of unstructured data.
Build Self-Healing Infrastructure: Develop autonomous reliability layers that diagnose upstream failures, adapt to API schema changes, and perform real-time data validation without manual intervention.
Optimize High-Velocity Analytics: Design and manage low-latency data flows into ClickHouse and BigQuery, ensuring our AI-driven insights are backed by high-performance OLAP architecture.
Scale Intelligent Connectors: Create robust integrations (APIs, streaming, S3) that leverage AI to handle diverse, non-standardized data sources at massive scale.
Engineering Leadership: Drive best practices in AI-assisted testing and observability, mentoring the team on the intersection of traditional backend stability and modern AI capabilities.

Requirements

5+ years of experience and a Bachelor's degree in Computer Science or Software Engineering.
Python & AI Mastery: Deep expertise in production-level Python. You've successfully integrated AI models/LLMs into backend services or data workflows.
OLAP Power User: Proven experience with ClickHouse, BigQuery, or Snowflake, specifically in schema optimization and materialized views for large-scale datasets.
Distributed Systems Expertise: Strong background in cloud environments (GCP/AWS) and real-time streaming tools like Kafka, Flink, or Spark.
Adaptive Mindset: You don't just build to a spec; you build systems that anticipate and recover from environment changes autonomously.
Product Ownership: A track record of leading high-impact initiatives and a desire to own the "brain" of a global data platform.

Apply for this position