LLMStream
Defined in: core/options.ts:76
Extends
AsyncIterable<LLMStreamChunk>
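Because LLMStream extends AsyncIterable<LLMStreamChunk>, any instance can be consumed with a standard for await...of loop. A minimal sketch, assuming the LLMStream and LLMStreamChunk types are exported from the package root; see the Streaming guide for how a stream is actually obtained and what each chunk contains:

```ts
import type { LLMStream, LLMStreamChunk } from "llmist";

// Sketch: drain a stream into an array of chunks using the
// AsyncIterable interface that LLMStream extends.
async function collectChunks(stream: LLMStream): Promise<LLMStreamChunk[]> {
  const chunks: LLMStreamChunk[] = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
  return chunks;
}
```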