Platform: Odin AI | Last Updated: March 2026

Credits in Enterprise Knowledge are consumed in two ways:

1. Platform Action Credits: charged when users perform specific actions (uploading documents, sending messages, invoking tools).
2. LLM Token Credits: charged based on actual Large Language Model (LLM) usage, calculated from:
   • Input tokens (prompt, document chunks, system context)
   • Output tokens (LLM response)

LLM usage is converted into credits based on our pricing model.

Credit Consumption Table

A. Knowledge Base (KB) Ingestion

| Action | Credit Type | Calculation | Example |
|---|---|---|---|
| Upload document to KB | Word-based | 1 credit per 10,000 words | 100,000 words → 10 credits |
| LLM processing during ingestion | Token-based | (Input + Output tokens) × model rate | Varies by model |
Example – 100,000 Word Document

KB Processing Cost

| Document Size | Calculation | Credits |
|---|---|---|
| 100,000 words | 1 credit per 10,000 words | 10 credits |

LLM Token Cost (Ingestion)

LLM Token Cost applies when the project is configured to use LLM Extraction (i.e., an LLM is used to extract the data from the document). It is calculated from:
• Input tokens from document parsing
• Output tokens generated
• Model pricing
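The word-based portion of the ingestion cost can be sketched as a one-line calculation. This is a minimal illustration, not platform code: the function name is ours, and rounding partial 10,000-word blocks up to a whole credit is an assumption (the table only shows the exact 100,000-word case).

```python
import math

WORDS_PER_CREDIT = 10_000  # 1 credit per 10,000 words (word-based ingestion)

def ingestion_word_credits(total_words: int) -> int:
    """Word-based KB ingestion cost.

    Assumption: partial 10,000-word blocks round up to a full credit.
    """
    return math.ceil(total_words / WORDS_PER_CREDIT)

print(ingestion_word_credits(100_000))  # → 10
```

Any LLM Token Credits from LLM Extraction would be added on top of this word-based figure.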

B. Chat / Agent Interaction

Fixed Platform Credits

| Action | Credits |
|---|---|
| User sends a chat message | 1–2 credits (based on configuration) |
| Tool call invoked | 1 credit per tool call |

Variable LLM Token Credits

LLM credits are calculated as:

(Input Tokens × Input Rate) + (Output Tokens × Output Rate)

Example pricing (Claude 4.5 sample model):
• Input: $3 per 1M tokens
• Output: $15 per 1M tokens
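The token-cost formula above can be sketched directly, using the sample Claude 4.5 rates from this section. The function name is illustrative; actual rates are configurable per model.

```python
# Sample per-token rates derived from the example pricing above:
# $3 per 1M input tokens, $15 per 1M output tokens.
INPUT_RATE = 3.0 / 1_000_000    # dollars per input token
OUTPUT_RATE = 15.0 / 1_000_000  # dollars per output token

def llm_token_cost(input_tokens: int, output_tokens: int) -> float:
    """(Input Tokens × Input Rate) + (Output Tokens × Output Rate), in dollars."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

print(round(llm_token_cost(1_000_000, 100_000), 2))  # → 4.5
```

Here 1M input tokens cost $3 and 100k output tokens cost $1.50, for $4.50 total.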

Full Chat Example

Scenario: User asks, "Explain the attached document."

Platform Credits

| Component | Credits |
|---|---|
| Question asked | 1 |
| Tool calls (document retrieval) | 2 |
| Fixed Platform Credits | 3 credits |

LLM Token Usage

| Type | Tokens | Cost |
|---|---|---|
| Input | ~53,634 tokens | ~$0.161 |
| Output | ~900 tokens | ~$0.0135 |
| Total LLM Cost | | ~$0.1745 |

Converted to credits: 17 credits

Final Total
Fixed platform credits: 3
LLM usage credits: 17
─────────────────────────
Total: 20 credits
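The full example can be reproduced end to end. One value here is an assumption: the dollar-to-credit conversion. A rate of $0.01 per credit, rounded down, is consistent with the $0.1745 → 17 credits figure above, but the actual conversion is set by the platform's pricing model.

```python
import math

# Sample Claude 4.5 rates from the pricing section above.
INPUT_RATE = 3.0 / 1_000_000    # dollars per input token
OUTPUT_RATE = 15.0 / 1_000_000  # dollars per output token
DOLLARS_PER_CREDIT = 0.01       # ASSUMED conversion, consistent with $0.1745 → 17

# Fixed platform credits: 1 for the question + 2 tool calls.
fixed_credits = 1 + 2

# Variable LLM cost: ~53,634 input tokens, ~900 output tokens.
llm_cost = 53_634 * INPUT_RATE + 900 * OUTPUT_RATE   # ≈ $0.1744
llm_credits = math.floor(llm_cost / DOLLARS_PER_CREDIT)

total = fixed_credits + llm_credits
print(round(llm_cost, 4), llm_credits, total)  # → 0.1744 17 20
```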

Cost Calculation Formula Summary

Document Upload:
Word Credits = Total Words ÷ 10,000
Total = Word Credits + LLM Token Credits (Parsing)

Chat Message:
Total = Fixed Message Credit + Tool Call Credits + LLM Token Credits (Input + Output)

What Drives LLM Credit Usage?

LLM cost increases when:
▶ Large documents are retrieved into context
▶ Many KB chunks are injected into the prompt
▶ Responses are long or structured
▶ Multiple tool calls are triggered
▶ Higher-cost models are selected

Important Notes for Customers

✓ Word-based ingestion cost is predictable.
✓ Chat costs vary significantly depending on document size and the number of tokens used.
✓ Model pricing is configurable in Super Admin.
✓ LLM credits are consumption-based and cannot be flat-rated.
✓ Final credit total = Platform Credits + LLM Credits.