# Anthropic Provider

Set your Anthropic API key:

```bash
export ANTHROPIC_API_KEY=sk-ant-...
```

llmist will automatically discover and use Anthropic.
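For example (a minimal sketch, assuming the key above is exported in the same shell), no provider configuration is needed:

```typescript
import { LLMist } from 'llmist';

// With ANTHROPIC_API_KEY set, llmist picks up the Anthropic provider
// automatically; no adapter configuration is required.
const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .askAndCollect('Hello, Claude!');

console.log(answer);
```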
## Available Models

| Model | Alias | Best For |
|---|---|---|
| `claude-opus-4-5` | `opus` | Complex reasoning, creative tasks |
| `claude-sonnet-4-5` | `sonnet` | Balanced performance (recommended) |
| `claude-haiku-4-5` | `haiku` | Fast, cost-effective tasks |
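As a quick illustration (a sketch, assuming aliases and full model IDs are interchangeable in `withModel`, per the table above):

```typescript
import { LLMist } from 'llmist';

// Both agents target the same model: one via alias, one via full ID.
const byAlias = LLMist.createAgent().withModel('sonnet');
const byFullId = LLMist.createAgent().withModel('claude-sonnet-4-5');
```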
## Usage Examples

Basic question:

```typescript
import { LLMist } from 'llmist';

const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .askAndCollect('Write a haiku about TypeScript');
```

With a system prompt:

```typescript
import { LLMist } from 'llmist';

const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .withSystem('You are a helpful coding assistant. Be concise.')
  .askAndCollect('How do I read a file in Node.js?');
```

Passing file contents in the prompt:

```typescript
import { LLMist } from 'llmist';
import { readFileSync } from 'fs';

const codebase = readFileSync('src/main.ts', 'utf-8');

const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .withSystem('You are a code reviewer.')
  .askAndCollect(`Review this code:\n\n${codebase}`);
```

## Vision (Image Input)
Claude models support image input:

```typescript
import { LLMist, imageFromBuffer } from 'llmist';
import { readFileSync } from 'fs';

const imageBuffer = readFileSync('diagram.png');

const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .askWithImage(
    'Describe this diagram',
    imageFromBuffer(imageBuffer, 'image/png')
  )
  .askAndCollect();
```

Supported image formats: JPEG, PNG, GIF, WebP.
## Model Characteristics

### Claude Opus 4.5 (Most Capable)

- Best for complex reasoning and creative tasks
- Highest quality outputs
- Higher latency and cost
- 200K context window

### Claude Sonnet 4.5 (Recommended)

- Best balance of capability and speed
- Great for coding, analysis, and general tasks
- Good cost efficiency
- 200K context window

### Claude Haiku 4.5 (Fastest)

- Fastest response times
- Lowest cost
- Good for simple tasks and high-volume use
- 200K context window
## Configuration Options

```typescript
import { LLMist, AnthropicMessagesProvider } from 'llmist';

const client = new LLMist({
  autoDiscoverProviders: false,
  adapters: [
    new AnthropicMessagesProvider({
      apiKey: process.env.ANTHROPIC_API_KEY,
      baseUrl: 'https://api.anthropic.com', // Custom endpoint
    }),
  ],
});
```

## Best Practices
- Use Sonnet as default - Best balance for most tasks
- Reserve Opus for complex reasoning - When quality matters more than speed
- Use Haiku for high-volume - Simple tasks, classification, summarization (see the sketch below)
- Leverage long context - Claude handles 200K tokens well
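For instance, a high-volume classification loop might look like the following sketch (the ticket data and labels here are hypothetical, and the fluent API calls mirror the examples above):

```typescript
import { LLMist } from 'llmist';

// Hypothetical batch of short inputs; Haiku keeps latency and cost low.
const tickets = [
  'Cannot log in after password reset',
  'Invoice shows the wrong amount',
  'Feature request: dark mode',
];

for (const ticket of tickets) {
  const label = await LLMist.createAgent()
    .withModel('haiku')
    .withSystem('Classify the ticket as "bug", "billing", or "feature". Reply with one word.')
    .askAndCollect(ticket);
  console.log(`${ticket} -> ${label}`);
}
```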
## Cost Tracking

```typescript
// `agent` is an agent built with LLMist.createAgent() as in the examples above.
for await (const event of agent.run()) {
  if (event.type === 'llm_call_complete') {
    console.log('Input tokens:', event.usage?.promptTokens);
    console.log('Output tokens:', event.usage?.completionTokens);
    console.log('Estimated cost:', event.cost);
  }
}
```