llmist supports multiple LLM providers with convenient aliases for quick access.
Use short aliases instead of full model names:
| Alias | Full Model Name | Provider |
|---|---|---|
| gpt5 | gpt-5 | OpenAI |
| gpt5-mini | gpt-5-mini | OpenAI |
| gpt4o | gpt-4o | OpenAI |
| gpt4-turbo | gpt-4-turbo | OpenAI |
| o3-mini | o3-mini | OpenAI |
| sonnet | claude-sonnet-4-5 | Anthropic |
| opus | claude-opus-4-5 | Anthropic |
| haiku | claude-haiku-4-5 | Anthropic |
| flash | gemini-2.5-flash | Google |
| pro | gemini-3-pro-preview | Google |
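The alias mapping above can be pictured as a simple lookup table. The sketch below is illustrative only (llmist resolves aliases internally; this is not its actual API):

```typescript
// Illustrative alias table mirroring the mapping above.
// Not llmist's internal implementation — a sketch for clarity.
const MODEL_ALIASES: Record<string, string> = {
  "gpt5": "gpt-5",
  "gpt5-mini": "gpt-5-mini",
  "gpt4o": "gpt-4o",
  "gpt4-turbo": "gpt-4-turbo",
  "o3-mini": "o3-mini",
  "sonnet": "claude-sonnet-4-5",
  "opus": "claude-opus-4-5",
  "haiku": "claude-haiku-4-5",
  "flash": "gemini-2.5-flash",
  "pro": "gemini-3-pro-preview",
};

// Resolve an alias; an unrecognized name passes through unchanged.
function resolveModel(name: string): string {
  return MODEL_ALIASES[name] ?? name;
}
```

Unknown names fall through untouched, so full model names keep working wherever an alias is accepted.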
For explicit provider selection, use the provider:model format:
```bash
# Explicit provider selection (bunx)
bunx @llmist/cli complete "Hello" --model openai:gpt-5
bunx @llmist/cli complete "Hello" --model anthropic:claude-sonnet-4-5
bunx @llmist/cli complete "Hello" --model gemini:gemini-2.5-flash
```

```bash
# Explicit provider selection (npx)
npx @llmist/cli complete "Hello" --model openai:gpt-5
npx @llmist/cli complete "Hello" --model anthropic:claude-sonnet-4-5
npx @llmist/cli complete "Hello" --model gemini:gemini-2.5-flash
```
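Parsing the provider:model format amounts to splitting on the first colon. The helper below is hypothetical (not part of llmist), shown to make the format precise:

```typescript
// Split a "provider:model" spec at the first colon.
// A bare name with no colon is an alias or model name with no
// explicit provider. Hypothetical helper, for illustration only.
function parseModelSpec(spec: string): { provider?: string; model: string } {
  const i = spec.indexOf(":");
  if (i === -1) return { model: spec };
  return { provider: spec.slice(0, i), model: spec.slice(i + 1) };
}
```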
| Model | Vision | Streaming | Tool Use | Context (tokens) |
|---|---|---|---|---|
| GPT-5 | ✓ | ✓ | ✓ | 128K |
| GPT-5 Mini | ✓ | ✓ | ✓ | 128K |
| GPT-4o | ✓ | ✓ | ✓ | 128K |
| Claude Opus 4.5 | ✓ | ✓ | ✓ | 200K |
| Claude Sonnet 4.5 | ✓ | ✓ | ✓ | 200K |
| Claude Haiku 4.5 | ✓ | ✓ | ✓ | 200K |
| Gemini Flash | ✓ | ✓ | ✓ | 1M |
| Gemini Pro | ✓ | ✓ | ✓ | 1M |
| Use Case | Recommended | Why |
|---|---|---|
| General tasks | sonnet | Best balance of quality and speed |
| Complex reasoning | opus | Highest capability |
| High-volume tasks | haiku, flash | Fast and cost-effective |
| Long documents | flash, pro | 1M token context |
| Coding | sonnet, gpt5 | Strong code understanding |
| Vision tasks | gpt4o, flash | Excellent image analysis |
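The recommendations above can be encoded as data if you want to pick a model programmatically. This is a sketch with hypothetical names (llmist does not ship such a helper):

```typescript
// Use-case → recommended alias(es), mirroring the table above.
// Illustrative only; the UseCase names here are made up for this sketch.
type UseCase =
  | "general"
  | "reasoning"
  | "high-volume"
  | "long-documents"
  | "coding"
  | "vision";

const RECOMMENDED: Record<UseCase, string[]> = {
  "general": ["sonnet"],
  "reasoning": ["opus"],
  "high-volume": ["haiku", "flash"],
  "long-documents": ["flash", "pro"],
  "coding": ["sonnet", "gpt5"],
  "vision": ["gpt4o", "flash"],
};

// First entry is the primary recommendation.
function recommendModel(useCase: UseCase): string {
  return RECOMMENDED[useCase][0];
}
```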
| Model | Provider | Description |
|---|---|---|
| dall-e-3 | OpenAI | High-quality image generation |
| dall-e-2 | OpenAI | Faster, lower cost |
| imagen-3 | Google | Gemini image generation |
| Model | Provider | Description |
|---|---|---|
| tts-1 | OpenAI | Text-to-speech, standard quality |
| tts-1-hd | OpenAI | Text-to-speech, high quality |
```typescript
import { LLMist } from 'llmist';

// Uses the default model
const answer = await LLMist.createAgent()
  .askAndCollect('Hello!');

// Explicit provider:model selection
const answer2 = await LLMist.createAgent()
  .withModel('anthropic:claude-sonnet-4-5')
  .askAndCollect('Hello!');
```
```bash
# Alias and explicit forms are equivalent (bunx)
bunx @llmist/cli complete "Hello" --model sonnet
bunx @llmist/cli complete "Hello" --model anthropic:claude-sonnet-4-5
```

```bash
# Alias and explicit forms are equivalent (npx)
npx @llmist/cli complete "Hello" --model sonnet
npx @llmist/cli complete "Hello" --model anthropic:claude-sonnet-4-5
```
llmist automatically discovers available providers based on environment variables:
| Variable | Provider |
|---|---|
| OPENAI_API_KEY | OpenAI |
| ANTHROPIC_API_KEY | Anthropic |
| GEMINI_API_KEY | Google Gemini |
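Conceptually, discovery just checks which of these variables are set. The sketch below takes the environment as a plain object so it is easy to test; llmist's actual discovery logic may differ:

```typescript
// Env var → provider name, per the table above.
const PROVIDER_ENV_VARS: Record<string, string> = {
  OPENAI_API_KEY: "openai",
  ANTHROPIC_API_KEY: "anthropic",
  GEMINI_API_KEY: "gemini",
};

// Return the providers whose API key is present and non-empty.
// Pass process.env in real use; a plain object here for testability.
function discoverProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(PROVIDER_ENV_VARS)
    .filter(([key]) => Boolean(env[key]))
    .map(([, provider]) => provider);
}
```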
See Environment Variables for complete configuration.