AI Data Engineer - Cortex AI
Job description
Reposted 19 hours ago · Remote (hiring remotely in the US) · Mid level

About TechTorch

As an AI Data Engineer, you will design and build modern data pipelines and AI-ready data platforms with Snowflake as the core warehouse, leveraging Snowflake Cortex AI, AWS Bedrock, and other mainstream AI services. Your work will enable advanced analytics, LLM-powered use cases, and AI-driven automation across enterprise environments.
This role sits at the intersection of data engineering and applied AI. You will ensure data is high-quality, well-modeled, and accessible for downstream analytics, machine learning, and GenAI applications.
What You'll Do
- Design, build, and maintain scalable data pipelines using Snowflake as the central data platform
- Develop AI-ready data models and feature layers to support analytics, ML, and GenAI use cases
- Leverage Snowflake Cortex AI for embedding, classification, summarization, and AI-assisted analytics
- Integrate and operationalize AI workflows using AWS Bedrock and related AWS services (e.g., Lambda, Step Functions)
- Build and optimize ELT pipelines using tools such as dbt, SQL, and Python
- Integrate data from diverse sources including APIs, SaaS platforms, databases, and event streams
- Ensure data quality, observability, and governance across pipelines and AI workloads
- Collaborate with AI engineers, data scientists, and business teams to translate use cases into scalable data solutions
- Document data models, pipelines, and AI-related design decisions clearly for long-term maintainability

What We Offer

- Opportunity to work on AI-first data platforms using Snowflake Cortex and AWS Bedrock
- High-impact role at the intersection of data engineering and applied AI
- Exposure to private equity-backed enterprise transformation programs
- Global, collaborative team with strong technical standards
- Flexible, remote-first working environment with autonomy and ownership
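To give candidates a feel for the data-quality responsibilities described above: pipelines like these often gate records before they land in the warehouse, quarantining rows that fail basic checks. A minimal, hypothetical sketch in plain Python (the field names and rules are illustrative, not part of TechTorch's actual stack):

```python
# Hypothetical pre-load validation gate for an ELT pipeline.
# Rows failing a required-field check are quarantined rather than loaded,
# so downstream analytics only see records that pass basic quality rules.

def validate_rows(rows, required_fields):
    """Split rows into (clean, quarantined) based on required fields."""
    clean, quarantined = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            quarantined.append({"row": row, "missing": missing})
        else:
            clean.append(row)
    return clean, quarantined

# Example: two records, one missing its customer_id.
rows = [
    {"customer_id": "c-1", "event": "signup", "ts": "2025-01-01"},
    {"customer_id": "", "event": "purchase", "ts": "2025-01-02"},
]
clean, quarantined = validate_rows(rows, ["customer_id", "event", "ts"])
print(len(clean), len(quarantined))  # → 1 1
```

In a production setting this kind of check would typically live in a dbt test or an orchestrated task rather than ad-hoc code, but the principle is the same.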
Requirements
- 4+ years of experience in data engineering, with strong hands-on Snowflake experience
- Practical experience building AI-enabled data solutions or preparing data for ML/LLM workloads
- Strong proficiency in SQL and Python for data transformation and pipeline development
- Experience with Snowflake Cortex AI or similar warehouse-native AI capabilities
- Hands-on experience with AWS, ideally including AWS Bedrock, Lambda, S3, and IAM
- Experience with ELT tooling such as dbt, Airflow, or similar orchestration frameworks
- Solid understanding of data modeling, data warehousing, and performance optimization
- Comfortable working in cloud-native, enterprise environments with high delivery expectations
- Strong communication skills and ability to collaborate across technical and business teams
Nice to Have
- Experience with vector databases or embedding workflows (e.g., Snowflake, OpenSearch, Pinecone)
- Familiarity with LLM orchestration frameworks (e.g., LangChain, Bedrock Agents)
- Experience supporting AI agents, RAG pipelines, or GenAI analytics use cases
- Exposure to regulated or security-conscious environments
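For context on the embedding-workflow items above: at its core, vector retrieval ranks stored embeddings by similarity to a query embedding. A toy sketch using only stdlib Python (the vectors here are made up; real pipelines would use embeddings generated by a service such as Snowflake Cortex or Bedrock, and a proper vector store for the lookup):

```python
import math

# Toy nearest-neighbour lookup over a handful of hand-made "embeddings".
# Real systems delegate this to a vector store (e.g. OpenSearch, Pinecone)
# or warehouse-native search; the ranking principle is the same.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api authentication": [0.0, 0.2, 0.9],
}

query = [0.85, 0.15, 0.05]  # pretend this came from an embedding model
best = max(docs, key=lambda name: cosine(docs[name], query))
print(best)  # → refund policy
```

A RAG pipeline wraps exactly this step: embed the question, retrieve the most similar documents, then pass them to an LLM as context.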