Bring AI to your SQL queries

Spice transforms SQL into your interface for AI. Call models from providers like OpenAI, Anthropic, and Bedrock with the AI() SQL function to generate text, classify data, and enrich results.
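A minimal sketch of what an inline AI() call can look like. The `reviews` table, its columns, and the exact prompt-concatenation form are illustrative assumptions, not the definitive syntax:

```sql
-- Classify customer feedback inline, with no external API calls or glue code.
-- Table and column names here are hypothetical.
SELECT
  id,
  feedback,
  AI('Classify the sentiment of this feedback as positive, negative, or neutral: ' || feedback) AS sentiment
FROM reviews
LIMIT 10;
```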


AI where your data lives

Generate insights, build autonomous agents, and enrich results—directly from the tools and SQL workflows you already rely on.


Accelerate data analysis

Call large language models inline using SQL functions to summarize, translate, or classify data with no external APIs or glue code.

Learn more


Combine data and AI
in one workflow

Avoid context switching and integrate AI with standard SQL operations. Chain model responses to filters, joins, or aggregations to build RAG pipelines.
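As a sketch of this chaining, an AI() result can feed directly into standard SQL grouping. The `support_tickets` table, its columns, and the prompt are hypothetical:

```sql
-- Assumed example: chain an AI() classification into an aggregation.
WITH tagged AS (
  SELECT
    ticket_id,
    AI('Label this support ticket as bug, billing, or question: ' || body) AS category
  FROM support_tickets
)
SELECT category, COUNT(*) AS ticket_count
FROM tagged
GROUP BY category
ORDER BY ticket_count DESC;
```

The same pattern extends to joins and filters, so model output behaves like any other column in the query plan.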

Learn more


Maintain data governance
& security

All AI-driven operations are performed within your governed SQL environment, so data never leaves your compliance boundaries and access is fully auditable.

Learn more

Trusted by developers building production AI

Teams use Spice to integrate AI directly into SQL workflows—accelerating development and eliminating redundant pipelines.


“Partnering with Spice AI has transformed how NRC Health delivers AI-driven insights. By unifying siloed data across systems, we accelerated AI feature development, reducing time-to-market from months to weeks, and sometimes days. With predictable costs and faster innovation, Spice isn’t just solving some of our data and AI challenges; it’s helping us redefine personalized healthcare.”

Tim Ottersburg

VP of Technology, NRC Health


“Spice AI grounds AI in our actual data, using SQL queries across all our data. This brings accuracy to probabilistic AI systems, which are very prone to hallucinations.”

Rachel Wong

CTO, Basis Set

Integrations across all of your data sources

Built-in connectors for 30+ modern and legacy sources, from Databricks and S3 to MySQL and PostgreSQL, with provider-agnostic support for major LLM APIs.


FAQs

Answers to common questions about calling LLMs from SQL

What is the AI() function?

AI() is a built-in Spice function that lets you call large language models directly inside SQL queries. It takes a prompt (and optional data columns) as input and returns model completions as query results. This allows you to summarize, translate, generate, or classify text inline without additional code or API management.
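A brief illustration of passing data columns alongside a prompt. The `products` table and column names are assumptions for the sketch:

```sql
-- Illustrative only: enrich rows with generated text in a single query.
SELECT
  product_id,
  AI('Summarize in one sentence: ' || description) AS summary,
  AI('Translate to French: ' || description) AS description_fr
FROM products;
```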

How does text-to-SQL work?

Spice uses your preferred LLM to convert prompts into executable SQL. Results are constrained to your connected datasets and subject to all existing SQL permissions and governance rules.

Can I use different model providers?

Yes. Spice abstracts model providers behind a common interface; select OpenAI, Anthropic, Bedrock, or your custom model by name in each call. This keeps your SQL portable and future-proof.
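One way this could look, assuming AI() accepts a model identifier as an argument. The parameter form, model names, and the `meeting_notes` table are all assumptions; the exact calling convention may differ:

```sql
-- Sketch: swap providers by naming the model in the call.
SELECT
  AI('gpt-4o', 'Summarize: ' || notes) AS summary_openai,
  AI('claude-sonnet', 'Summarize: ' || notes) AS summary_anthropic
FROM meeting_notes;
```

Because only the model name changes, the surrounding query stays identical across providers.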

See Spice in action

Get a guided walkthrough of how development teams use Spice to query, accelerate, and integrate AI for mission-critical workloads.

Get a demo
