llm_provider
System module that manages LLM provider instances. It is configured automatically from the `brain:` section of each agent definition in YAML, and supports any OpenAI-compatible API as well as the Anthropic native SDK.
| Property | Value |
|---|---|
| Module ID | llm_provider |
| Version | 1.0.0 |
| Type | system (auto-loaded, hidden from agents) |
| Dependencies | openai (for OpenAI-compatible providers), anthropic (for Anthropic) |
Role in the Architecture
The LLM provider module is the bridge between Digitorn agents and LLM APIs. It:
- Auto-configures from the `brain:` section of each agent definition.
- Resolves provider URLs automatically -- `provider: deepseek` maps to `https://api.deepseek.com/v1`.
- Manages connections -- creates and reuses async HTTP clients with connection pooling.
- Handles tool calling -- detects whether the model supports native tool calling or requires text-based recovery.
- Normalizes responses -- provides a consistent response format regardless of provider.
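As a rough illustration of the normalized-response idea, a provider-agnostic result might look like the following (a minimal sketch; the field names are hypothetical and the module's actual response type may differ):

```python
from dataclasses import dataclass, field

@dataclass
class LLMResponse:
    """Hypothetical normalized result returned for every provider."""
    content: str                                     # assistant text, "" if tool-only
    tool_calls: list = field(default_factory=list)   # tool calls in one common shape
    model: str = ""                                  # model that produced the response
    finish_reason: str = "stop"                      # "stop", "tool_calls", "length", ...
    usage: dict = field(default_factory=dict)        # prompt/completion token counts
```

Downstream code can then handle DeepSeek, Anthropic, or a local Ollama model identically, since only the provider backend knows the raw wire format.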
Supported Providers
The module auto-resolves base URLs for common providers:
| Provider | Base URL | Backend |
|---|---|---|
| openai | https://api.openai.com/v1 | openai_compat |
| deepseek | https://api.deepseek.com/v1 | openai_compat |
| anthropic | https://api.anthropic.com | anthropic |
| groq | https://api.groq.com/openai/v1 | openai_compat |
| mistral | https://api.mistral.ai/v1 | openai_compat |
| together | https://api.together.xyz/v1 | openai_compat |
| ollama | http://localhost:11434/v1 | openai_compat |
| lm_studio | http://localhost:1234/v1 | openai_compat |
| vllm | http://localhost:8000/v1 | openai_compat |
| openrouter | https://openrouter.ai/api/v1 | openai_compat |
Custom providers work by specifying `base_url` directly in the brain config.
Configuration
The module is configured implicitly through agent brain definitions:
```yaml
agents:
  - id: assistant
    brain:
      provider: deepseek
      model: deepseek-chat
      temperature: 0.2
      max_tokens: 4096
      config:
        api_key: "{{env.DEEPSEEK_API_KEY}}"
```
No explicit `modules:` entry for `llm_provider` is needed.
Text-Based Tool Calling Recovery
When a model does not support native tool calling (or produces malformed tool calls), the module includes a multi-format parser that recovers tool calls from text:
- Native JSON tool call format
- XML-wrapped tool calls
- Markdown code block JSON
- Inline JSON with function names
- Unicode quote normalization
This makes Digitorn compatible with local models (Ollama, vLLM) that have imperfect tool calling support.
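A minimal sketch of this recovery strategy, assuming the parser tries each format in order (the function names, the `<tool_call>` tag, and the candidate order are illustrative assumptions, not the module's actual implementation):

```python
import json
import re

def normalize_quotes(text: str) -> str:
    """Replace common Unicode quotes with ASCII ones so json.loads succeeds."""
    for uni, repl in (("\u201c", '"'), ("\u201d", '"'), ("\u2018", "'"), ("\u2019", "'")):
        text = text.replace(uni, repl)
    return text

def recover_tool_call(text: str):
    """Try several text formats; return {"name": ..., ...} or None."""
    text = normalize_quotes(text)
    candidates = []
    # Markdown code-block JSON: ```json { ... } ```
    candidates += re.findall(r"```(?:json)?\s*(\{.*?\})\s*```", text, re.DOTALL)
    # XML-wrapped tool calls (tag name is an assumption)
    candidates += re.findall(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    # Last resort: any bare inline JSON object
    candidates += re.findall(r"(\{.*\})", text, re.DOTALL)
    for raw in candidates:
        try:
            obj = json.loads(raw)
        except json.JSONDecodeError:
            continue
        if isinstance(obj, dict) and "name" in obj:
            return obj
    return None
```

Trying the most structured formats first keeps the greedy bare-JSON fallback from misfiring on text that happens to contain braces.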
Actions (5)
These actions are for internal use and are hidden from agents:
| Action | Risk | Description |
|---|---|---|
configure | medium | Register a named provider instance |
chat | low | Send a chat completion request |
remove | low | Remove a provider instance |
list_providers | low | List configured providers |
update_defaults | low | Update default generation parameters |