Providers Overview

llmist supports multiple LLM providers out of the box with automatic discovery and seamless switching.

llmist automatically discovers available providers from environment variables:

```bash
# Set one or more API keys
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GEMINI_API_KEY=...
```

```typescript
import { LLMist } from 'llmist';

// Providers are auto-discovered from environment
const client = new LLMist();

// Use any available model
const agent = client.createAgent()
  .withModel('sonnet') // Uses Anthropic
  .ask('Hello!');
```
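
Because every provider is reached through the same client, switching between them is just a change of model string. A minimal sketch, assuming API keys for all three providers are set and that `.ask()` returns a promise resolving to the reply text:

```typescript
import { LLMist } from 'llmist';

const client = new LLMist();

// Switching providers is only a matter of the model string.
// Assumes .ask() resolves to the model's text reply.
const fromAnthropic = await client.createAgent().withModel('haiku').ask('Hello!');
const fromOpenAI = await client.createAgent().withModel('gpt5-mini').ask('Hello!');
const fromGemini = await client.createAgent().withModel('flash').ask('Hello!');
```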

Use friendly aliases instead of full model names:

| Alias | Full Model | Provider |
| --- | --- | --- |
| `gpt5` | `openai:gpt-5` | OpenAI |
| `gpt5-mini` | `openai:gpt-5-mini` | OpenAI |
| `sonnet` | `anthropic:claude-sonnet-4-5` | Anthropic |
| `haiku` | `anthropic:claude-haiku-4-5` | Anthropic |
| `opus` | `anthropic:claude-opus-4-5` | Anthropic |
| `flash` | `gemini:gemini-2.5-flash` | Gemini |
| `pro` | `gemini:gemini-3-pro-preview` | Gemini |
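
An alias is interchangeable with the full identifier it maps to. For example, reusing the builder shown above:

```typescript
// Equivalent selections: the alias is shorthand for the prefixed name.
client.createAgent().withModel('flash');
client.createAgent().withModel('gemini:gemini-2.5-flash');
```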

You can also specify the provider explicitly:

```typescript
// With provider prefix
.withModel('openai:gpt-5')
.withModel('anthropic:claude-sonnet-4-5-20250929')
.withModel('gemini:gemini-2.5-flash')
```

For advanced use cases, configure providers manually:

```typescript
import { LLMist, OpenAIChatProvider } from 'llmist';

const client = new LLMist({
  autoDiscoverProviders: false,
  adapters: [
    new OpenAIChatProvider({
      apiKey: process.env.MY_OPENAI_KEY,
      baseUrl: 'https://my-proxy.com/v1',
    }),
  ],
});
```
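
A manually configured client is used the same way as an auto-discovered one. A sketch, assuming the proxy at `baseUrl` exposes the standard OpenAI-compatible API and that `.ask()` resolves to text:

```typescript
// The manually registered adapter serves requests for its models.
const reply = await client.createAgent()
  .withModel('openai:gpt-5')
  .ask('Hello through the proxy!');
```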