llm_provider

System module that manages LLM provider instances. Automatically configured from the `brain:` section of agent definitions in YAML. Supports any OpenAI-compatible API and the Anthropic native SDK.

| Property | Value |
| --- | --- |
| Module ID | `llm_provider` |
| Version | 1.0.0 |
| Type | system (auto-loaded, hidden from agents) |
| Dependencies | `openai` (for OpenAI-compatible providers), `anthropic` (for Anthropic) |

Role in the Architecture

The LLM provider module is the bridge between Digitorn agents and LLM APIs. It:

  1. Auto-configures from the brain: section of each agent definition.
  2. Resolves provider URLs automatically -- provider: deepseek maps to https://api.deepseek.com/v1.
  3. Manages connections -- creates and reuses async HTTP clients with connection pooling.
  4. Handles tool calling -- detects whether the model supports native tool calling or requires text-based recovery.
  5. Normalizes responses -- provides a consistent response format regardless of provider.
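As a sketch of what "normalized responses" could mean in practice, the following dataclass shows one plausible shape; the class and field names are illustrative assumptions, not taken from the module's source:

```python
from dataclasses import dataclass, field

@dataclass
class LLMResponse:
    """Hypothetical provider-independent response shape (illustrative)."""
    content: str                                     # assistant text, may be empty for pure tool calls
    tool_calls: list = field(default_factory=list)   # normalized tool-call dicts
    model: str = ""                                  # model that produced the response
    finish_reason: str = "stop"                      # "stop", "tool_calls", "length", ...
    usage: dict = field(default_factory=dict)        # prompt/completion token counts

# A caller sees the same shape whether the backend was openai_compat or anthropic.
resp = LLMResponse(content="Hello", model="deepseek-chat",
                   usage={"prompt_tokens": 12, "completion_tokens": 3})
```

The point of such a shape is that downstream code (tool dispatch, logging, retries) never branches on which provider produced the response.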

Supported Providers

The module auto-resolves base URLs for common providers:

| Provider | Base URL | Backend |
| --- | --- | --- |
| `openai` | https://api.openai.com/v1 | openai_compat |
| `deepseek` | https://api.deepseek.com/v1 | openai_compat |
| `anthropic` | https://api.anthropic.com | anthropic |
| `groq` | https://api.groq.com/openai/v1 | openai_compat |
| `mistral` | https://api.mistral.ai/v1 | openai_compat |
| `together` | https://api.together.xyz/v1 | openai_compat |
| `ollama` | http://localhost:11434/v1 | openai_compat |
| `lm_studio` | http://localhost:1234/v1 | openai_compat |
| `vllm` | http://localhost:8000/v1 | openai_compat |
| `openrouter` | https://openrouter.ai/api/v1 | openai_compat |

Custom providers work by specifying `base_url` directly in the brain config.
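The resolution rule can be sketched as follows; the mapping mirrors the table above, while the function name and the shape of the brain dict are assumptions for illustration:

```python
# Provider-name -> base-URL mapping, as listed in the table above.
PROVIDER_URLS = {
    "openai": "https://api.openai.com/v1",
    "deepseek": "https://api.deepseek.com/v1",
    "anthropic": "https://api.anthropic.com",
    "groq": "https://api.groq.com/openai/v1",
    "mistral": "https://api.mistral.ai/v1",
    "together": "https://api.together.xyz/v1",
    "ollama": "http://localhost:11434/v1",
    "lm_studio": "http://localhost:1234/v1",
    "vllm": "http://localhost:8000/v1",
    "openrouter": "https://openrouter.ai/api/v1",
}

def resolve_base_url(brain: dict) -> str:
    """An explicit base_url in config wins; otherwise look up the provider name."""
    explicit = brain.get("config", {}).get("base_url")
    if explicit:
        return explicit
    provider = brain.get("provider", "")
    if provider in PROVIDER_URLS:
        return PROVIDER_URLS[provider]
    raise ValueError(f"unknown provider {provider!r} and no base_url given")
```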


Configuration

The module is configured implicitly through agent brain definitions:

```yaml
agents:
  - id: assistant
    brain:
      provider: deepseek
      model: deepseek-chat
      temperature: 0.2
      max_tokens: 4096
      config:
        api_key: "{{env.DEEPSEEK_API_KEY}}"
```

No explicit `modules:` declaration for `llm_provider` is needed.
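The `{{env.NAME}}` placeholder syntax comes from the example above; a minimal sketch of how such substitution could work is shown below (the implementation itself is an illustrative assumption):

```python
import os
import re

# Matches {{env.NAME}} where NAME is a valid environment-variable identifier.
_ENV_PATTERN = re.compile(r"\{\{env\.([A-Za-z_][A-Za-z0-9_]*)\}\}")

def expand_env(value: str) -> str:
    """Replace every {{env.NAME}} placeholder with os.environ['NAME']."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return _ENV_PATTERN.sub(repl, value)
```

Resolving secrets at load time keeps API keys out of the YAML files themselves.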


Text-Based Tool Calling Recovery

When a model does not support native tool calling (or produces malformed tool calls), the module includes a multi-format parser that recovers tool calls from text:

  1. Native JSON tool call format
  2. XML-wrapped tool calls
  3. Markdown code block JSON
  4. Inline JSON with function names
  5. Unicode quote normalization

This makes Digitorn compatible with local models (Ollama, vLLM) that have imperfect tool calling support.
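A simplified sketch of such a recovery parser is shown below. It handles three of the listed cases (unicode quote normalization, markdown code blocks, and an XML-style wrapper) before falling back to scanning for inline JSON; the function name, the `<tool_call>` tag, and the exact precedence are illustrative assumptions, not the module's actual parser:

```python
import json
import re

# Map common "smart" quotes to their ASCII equivalents so json.loads can cope.
SMART_QUOTES = {"\u201c": '"', "\u201d": '"', "\u2018": "'", "\u2019": "'"}

def recover_tool_call(text: str):
    """Try to pull a {"name": ..., "arguments": ...} tool call out of free text."""
    # 1. Normalize unicode quotes.
    for smart, plain in SMART_QUOTES.items():
        text = text.replace(smart, plain)
    # 2. Unwrap a markdown ```json ... ``` block if present.
    fenced = re.search(r"```(?:json)?\s*(\{.*?\})\s*```", text, re.DOTALL)
    if fenced:
        text = fenced.group(1)
    # 3. Unwrap an XML-style <tool_call>...</tool_call> wrapper if present.
    xml = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    if xml:
        text = xml.group(1)
    # 4. Fall back to the first parseable JSON object with a "name" key.
    for match in re.finditer(r"\{.*\}", text, re.DOTALL):
        try:
            obj = json.loads(match.group(0))
        except json.JSONDecodeError:
            continue
        if isinstance(obj, dict) and "name" in obj:
            return obj
    return None
```

A production parser would also need to handle truncated JSON and multiple tool calls per message; the sketch only shows the layered-fallback idea.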


Actions (6)

These actions are for internal use and are hidden from agents:

| Action | Risk | Description |
| --- | --- | --- |
| `configure` | medium | Register a named provider instance |
| `chat` | low | Send a chat completion request |
| `remove` | low | Remove a provider instance |
| `list_providers` | low | List configured providers |
| `update_defaults` | low | Update default generation parameters |