Senior Data Engineer

Kong Inc.
San Francisco, United States of America
2 days ago

Role details

Contract type
Permanent contract
Employment type
Part-time (≤ 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$175K

Job location

Remote
San Francisco, United States of America

Tech stack

Adaptable Database Systems
API
Artificial Intelligence
Airflow
Data analysis
ARM
Information Engineering
Data Governance
Data Infrastructure
ETL
Data Transformation
Data Warehousing
Document-Oriented Databases
Python
Machine Learning
Operational Databases
Query Optimization
Role-Based Access Control
Cloud Services
Standard SQL
Systems Integration
AI Infrastructure
Large Language Models
Snowflake
Multi-Agent Systems
Prompt Engineering
Generative AI
Data Layers
AI Platforms
Low Latency
Operational Systems
GPT
Data Pipelines

Job description

We're looking for a Senior Data & AI Engineer to join our Revenue Analytics team. In this role, you'll own both the data infrastructure and the AI systems that power revenue insights and intelligent experiences across the business. You'll be responsible for building and maintaining reliable pipelines and scalable data models, and for determining how we connect those data assets to AI tools: which solutions we invest in, and how we do so securely and cost-effectively.

You'll sit at the intersection of data engineering, AI/ML, and platform architecture, with high visibility and real impact on our long-term data and AI strategy.

What You'll Do

Data Infrastructure & Pipelines

  • Design, build, and maintain ETL/ELT pipelines using Fivetran + Snowflake integrations to ingest data from a variety of sources into our Snowflake data warehouse
  • Develop and manage robust data models in Snowflake, ensuring data is structured for performance, reliability, and ease of use by analysts and business stakeholders
  • Use Hightouch to operationalize data by syncing warehouse data to downstream CRM, marketing, and sales tools
  • Monitor pipeline health, troubleshoot data quality issues, and implement alerting to proactively catch failures
  • Document data models, pipelines, and lineage to support a culture of data literacy and self-service analytics
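As a minimal sketch of the pipeline-monitoring responsibility above, here is a freshness check over per-table load timestamps. The table names and SLA thresholds are illustrative assumptions, not taken from the posting; a production version would read load times from warehouse metadata and emit alerts instead of printing.

```python
from datetime import datetime, timedelta, timezone

# Illustrative per-table freshness SLAs (hypothetical table names).
FRESHNESS_SLA = {
    "raw.salesforce.opportunity": timedelta(hours=2),
    "raw.stripe.invoice": timedelta(hours=6),
}

def stale_tables(last_loaded: dict, now: datetime) -> list:
    """Return tables whose most recent load exceeds their SLA window."""
    alerts = []
    for table, sla in FRESHNESS_SLA.items():
        loaded_at = last_loaded.get(table)
        if loaded_at is None or now - loaded_at > sla:
            alerts.append(table)
    return sorted(alerts)

# Example: one table within its SLA, one past it.
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "raw.salesforce.opportunity": now - timedelta(minutes=30),  # fresh
    "raw.stripe.invoice": now - timedelta(hours=7),             # stale
}
print(stale_tables(loads, now))  # ['raw.stripe.invoice']
```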

AI Infrastructure & Integrations

  • Integrate Claude and other LLMs directly with our Snowflake data warehouse, enabling AI-powered querying, summarization, and insight generation on top of live revenue data
  • Build and maintain data sources, semantic layers, and search services within Snowflake Cortex and connected AI platforms
  • Design and deploy AI agents that can reason over structured and unstructured revenue data to support go-to-market workflows
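A hedged sketch of the "AI-powered querying and summarization" flow described above: a natural-language question is turned into SQL by an LLM, the SQL runs against the warehouse, and the result rows are summarized. Both the LLM and the warehouse are stubbed here; in a real deployment these stubs would be replaced by calls to the Anthropic API and a Snowflake connector session, and the schema, query, and answers below are invented.

```python
def llm(prompt: str) -> str:
    # Stub: a real call would send the prompt (with schema context) to Claude.
    if prompt.startswith("Write SQL"):
        return "SELECT region, SUM(arr) AS arr FROM revenue GROUP BY region"
    return "ARR is concentrated in two regions."

def query_warehouse(sql: str):
    # Stub result set standing in for a Snowflake cursor fetch.
    return [("EMEA", 1_200_000), ("AMER", 3_400_000)]

def answer(question: str) -> str:
    """NL question -> generated SQL -> warehouse rows -> NL summary."""
    sql = llm(f"Write SQL for: {question}")
    rows = query_warehouse(sql)
    return llm(f"Summarize these rows for '{question}': {rows}")

print(answer("How is ARR distributed by region?"))
```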

Agent Orchestration

  • Architect and manage multi-step agent workflows, coordinating across tools, APIs, and data sources to automate complex analytical and operational tasks
  • Evaluate and implement orchestration frameworks (e.g., LangChain, LlamaIndex, or custom solutions) best suited to our use cases
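The multi-step workflows above can be sketched with a minimal tool-execution loop. This is a toy, framework-free illustration: the tools are stubs, and the plan is fixed, whereas frameworks like LangChain let an LLM choose the next step from prior results.

```python
def run_sql(query: str) -> str:
    return f"rows for: {query}"          # stub for a warehouse call

def summarize(text: str) -> str:
    return f"summary of [{text}]"        # stub for an LLM call

TOOLS = {"run_sql": run_sql, "summarize": summarize}

def run_workflow(steps):
    """Execute (tool, argument) steps in order; a None argument means
    'feed in the previous step's result'."""
    result = None
    for tool_name, arg in steps:
        result = TOOLS[tool_name](arg if arg is not None else result)
    return result

plan = [("run_sql", "SELECT region, SUM(arr) FROM revenue GROUP BY 1"),
        ("summarize", None)]
print(run_workflow(plan))
```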

AI Strategy & Evaluation

  • Run rigorous evaluations of AI tools, models, and platforms to determine the best solution for each use case (e.g., Snowflake Cortex vs. Claude vs. Gemini vs. custom fine-tuned models)
  • Develop evaluation frameworks covering quality, latency, cost, and security to inform build vs. buy decisions and guide our overall AI roadmap
  • Stay current on the rapidly evolving AI landscape and proactively recommend new tools or approaches as the space matures
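An evaluation framework over the quality, latency, and cost axes named above might reduce to a weighted score like the toy harness below. All scores, weights, and model names are invented; a real framework would measure these axes against labeled evaluation sets.

```python
CANDIDATES = {
    # model: (quality 0-1, p95 latency in seconds, $ per 1K requests)
    "model_a": (0.92, 2.5, 8.0),
    "model_b": (0.85, 0.8, 1.5),
}

def score(quality, latency_s, cost, w_quality=0.6, w_latency=0.2,
          w_cost=0.2, max_latency=5.0, max_cost=10.0):
    """Weighted score; latency and cost are normalized then inverted,
    so lower latency and lower cost raise the score."""
    return (w_quality * quality
            + w_latency * (1 - min(latency_s / max_latency, 1.0))
            + w_cost * (1 - min(cost / max_cost, 1.0)))

def best(candidates):
    return max(candidates, key=lambda m: score(*candidates[m]))

print(best(CANDIDATES))  # model_b: cheaper and faster outweighs quality gap
```

With these (invented) weights, the cheaper and faster model wins despite slightly lower quality; shifting weight toward quality would flip the decision, which is exactly the build-vs-buy trade-off the framework is meant to surface.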

Security & Governance

  • Implement and manage row-level security (RLS) in Snowflake to ensure AI tools only surface data that users are authorized to see
  • Maintain and evolve role-based access controls (RBAC) alongside new RLS policies
  • Contribute to data governance practices, including access controls, PII handling, and schema management
  • Partner with Data, Security, and Legal teams to establish AI data governance standards and guardrails
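In Snowflake, the RLS described above is implemented with row access policies. The sketch below generates the two DDL statements involved; the policy, entitlement-table, table, and column names are all hypothetical, and a real deployment would execute these through a Snowflake connector session.

```python
def row_access_policy_ddl(policy, mapping_table, table, column):
    """Build (create, attach) DDL for a Snowflake row access policy that
    keeps a row visible only if an entitlement row exists for the
    caller's current role."""
    create = (
        f"CREATE ROW ACCESS POLICY {policy} AS ({column} VARCHAR) "
        f"RETURNS BOOLEAN -> EXISTS (SELECT 1 FROM {mapping_table} m "
        f"WHERE m.role_name = CURRENT_ROLE() AND m.{column} = {column})"
    )
    attach = f"ALTER TABLE {table} ADD ROW ACCESS POLICY {policy} ON ({column})"
    return create, attach

create_sql, attach_sql = row_access_policy_ddl(
    "region_rls", "gov.region_entitlements", "analytics.revenue", "region")
print(create_sql)
print(attach_sql)
```

Because the policy is enforced in the warehouse, any AI tool querying through a governed role inherits it automatically, which is what keeps the LLM layer from surfacing unauthorized rows.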

Cost Management & Optimization

  • Monitor and manage AI credit consumption across Snowflake Cortex, API usage, and other platforms to keep spending within budget
  • Identify and implement optimizations - such as caching, prompt tuning, model selection, and query efficiency improvements - to reduce cost without sacrificing quality
  • Build reporting to give stakeholders visibility into AI spend and usage trends
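The spend-visibility reporting above could start as simply as a rollup of usage records against per-platform rates. The records, platform names, and prices below are illustrative only; real rates would come from vendor billing data.

```python
from collections import defaultdict

PRICE_PER_1K_TOKENS = {"claude": 0.015, "cortex": 0.002}  # invented rates

usage = [
    {"platform": "claude", "team": "revops", "tokens": 400_000},
    {"platform": "cortex", "team": "revops", "tokens": 2_000_000},
    {"platform": "claude", "team": "finance", "tokens": 100_000},
]

def spend_by_team(records):
    """Aggregate estimated dollar spend per team across platforms."""
    totals = defaultdict(float)
    for r in records:
        rate = PRICE_PER_1K_TOKENS[r["platform"]]
        totals[r["team"]] += r["tokens"] / 1000 * rate
    return dict(totals)

print(spend_by_team(usage))
```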

Cross-Functional Partnership

  • Partner with Revenue Operations, Finance, and Sales to understand data needs and translate them into scalable engineering solutions
  • Collaborate across technical and non-technical teams to deliver data and AI solutions that directly influence revenue strategy

Requirements

If you don't think you meet all of the criteria below but are still interested in the job, please apply. Nobody checks every box; we're looking for candidates who are particularly strong in a few areas and have some interest and capability in others.

  • 3+ years of experience in a data engineering, analytics engineering, or AI/ML engineering role

  • Hands-on experience with Snowflake, including data modeling, query optimization, Cortex Analyst, Cortex Search, semantic layers, and its security model (RBAC, RLS)
  • Proficiency with Fivetran for pipeline orchestration and connector management
  • Experience with Hightouch or similar reverse ETL tools for syncing data to operational systems
  • Experience integrating LLMs (Claude, GPT-4, or similar) into production data workflows via API
  • Familiarity with agent orchestration frameworks and patterns (e.g., LangChain, LlamaIndex, CrewAI, or custom implementations)
  • Strong understanding of AI/LLM evaluation methodologies - you know how to measure whether an AI solution is actually working
  • Experience with prompt engineering, retrieval-augmented generation (RAG), and/or fine-tuning
  • Strong SQL and Python skills
  • A security-first mindset with experience managing data access controls in cloud data platforms
  • Strong communication skills and the ability to collaborate across technical and non-technical teams

Nice To Have

  • Experience with Anthropic's Claude API or Claude for Enterprise
  • Familiarity with vector databases (e.g., Pinecone, Weaviate, or Snowflake's native vector support)
  • Experience with dbt for data transformation
  • Experience with a workflow orchestration tool such as Airflow or Prefect
  • Familiarity with data observability tools (e.g., Monte Carlo, Metaplane)
  • Background supporting Sales, Finance, or Revenue Operations use cases
  • Experience building internal AI tools or copilots for business teams

Benefits & conditions

  • Shape our data and AI strategy from the ground up - this is a greenfield opportunity with real ownership
  • Work on data that directly influences revenue strategy and business decisions
  • Work at the cutting edge of enterprise AI on a team that's fully bought in on building with AI responsibly
  • High cross-functional visibility across Revenue, Finance, Product, and Engineering
  • Competitive salary, equity, and benefits

About the company

Kong Inc., a leading developer of API and AI connectivity technologies, is building the infrastructure that powers the agentic era. Trusted by the Fortune 500 and startups alike, Kong's unified API and AI platform, Kong Konnect, enables organizations to secure, manage, accelerate, govern, and monetize the flow of intelligence across APIs and AI models. For more information, visit www.konghq.com.

Compensation range: $125K - $175K
Company size: 201-500

Apply for this position