
Providers

llmist supports multiple LLM providers out of the box.

| Provider      | Env Variable        | Prefix       |
| ------------- | ------------------- | ------------ |
| OpenAI        | `OPENAI_API_KEY`    | `openai:`    |
| Anthropic     | `ANTHROPIC_API_KEY` | `anthropic:` |
| Google Gemini | `GEMINI_API_KEY`    | `gemini:`    |

llmist automatically discovers providers based on environment variables:

```bash
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."
```
```ts
const client = new LLMist();

client.withModel('gpt-5');              // OpenAI (auto-detected)
client.withModel('claude-sonnet-4-5');  // Anthropic (auto-detected)
client.withModel('gemini-2.5-flash');   // Gemini (auto-detected)
```

To pin a provider explicitly, use the `provider:model` format:

```ts
client.withModel('openai:gpt-5');
client.withModel('anthropic:claude-sonnet-4-5-20250929');
client.withModel('gemini:gemini-2.5-flash');
```
Providers can also be registered manually, with auto-discovery disabled:

```ts
import { LLMist, OpenAIChatProvider, AnthropicMessagesProvider } from 'llmist';

const client = new LLMist({
  autoDiscoverProviders: false,
  adapters: [
    new OpenAIChatProvider({ apiKey: 'sk-...' }),
    new AnthropicMessagesProvider({ apiKey: 'sk-ant-...' }),
  ],
  defaultProvider: 'openai',
});
```
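
With auto-discovery turned off, only the adapters listed in `adapters` can serve requests. A small usage sketch follows; the exact fallback behavior of `defaultProvider` for unprefixed model names is an assumption here:

```ts
// Explicit prefix routes to the matching adapter.
client.withModel('anthropic:claude-sonnet-4-5-20250929');

// A bare model name is assumed to fall back to defaultProvider ('openai').
client.withModel('gpt-5');
```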
Custom providers implement the `ProviderAdapter` interface:

```ts
interface ProviderAdapter {
  readonly providerId: string;
  readonly priority?: number;

  supports(model: ModelDescriptor): boolean;
  stream(options: LLMGenerationOptions, descriptor: ModelDescriptor): LLMStream;

  getModelSpecs?(): ModelSpec[];
  countTokens?(messages: LLMMessage[], descriptor: ModelDescriptor): Promise<number>;
}
```
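
As a rough sketch, a custom adapter might look like the following, assuming the types above are exported from `llmist`. The `model.provider` field and the stubbed `stream()` body are illustrative assumptions, not documented behavior:

```ts
import type {
  LLMGenerationOptions,
  LLMStream,
  ModelDescriptor,
  ProviderAdapter,
} from 'llmist';

// Hypothetical adapter for a self-hosted backend.
class LocalBackendProvider implements ProviderAdapter {
  readonly providerId = 'local';
  readonly priority = 10;

  supports(model: ModelDescriptor): boolean {
    // Assumed: the descriptor exposes the parsed provider prefix.
    return model.provider === 'local';
  }

  stream(options: LLMGenerationOptions, descriptor: ModelDescriptor): LLMStream {
    // Forward the request to the local backend and adapt its output into
    // the stream type llmist expects; left as a stub in this sketch.
    throw new Error('LocalBackendProvider.stream not implemented');
  }
}
```

Such an adapter would be registered through the `adapters` array shown earlier, e.g. `adapters: [new LocalBackendProvider()]`.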