Credits in Odin AI are consumed in two ways:
Platform Action Credits
Charged when users perform specific actions (uploading documents, sending messages, invoking tools).
LLM Token Credits
Charged based on actual Large Language Model (LLM) usage, calculated from:
• Input tokens (prompt, document chunks, system context)
• Output tokens (LLM response)
LLM usage is converted into credits based on our pricing model.
Credit Consumption Table
A. Knowledge Base (KB) Ingestion
| Action | Type | Rate | Example |
|---|---|---|---|
| Upload document to KB | Word-based | 1 credit per 10,000 words | 100,000 words → 10 credits |
| LLM processing during ingestion | Token-based | Input + output tokens × model rate | Varies by model |
Example – 100,000 word document
| Item | Credits |
|---|---|
| 100,000 words | 10 credits |
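The word-based portion of the ingestion charge can be sketched as a small helper. Rounding partial 10,000-word blocks up is an assumption here; the platform's exact rounding rule is not stated in this document.

```python
import math

def ingestion_word_credits(total_words: int) -> int:
    """Word-based KB ingestion credits: 1 credit per 10,000 words.

    Assumption: partial blocks round up (the exact rounding rule
    may differ in the platform).
    """
    return math.ceil(total_words / 10_000)

# The worked example above: a 100,000-word document
print(ingestion_word_credits(100_000))  # → 10
```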
LLM Token Cost (Ingestion)
LLM token cost applies only when the project is configured to use LLM Extraction (i.e., an LLM is used to extract data from the document). It is calculated from:
• Input tokens from document parsing
• Output tokens generated
• Model pricing
B. Chat / Agent Interaction
Fixed platform credits
| Action | Credits |
|---|---|
| User sends a chat message | 1–2 (based on configuration) |
| Tool call invoked | 1 per call |
Variable LLM Token Credits
LLM credits are calculated as:
(Input Tokens × Input Rate) + (Output Tokens × Output Rate)
Example pricing (Claude 4.5 sample model):
• Input: $3 per 1M tokens
• Output: $15 per 1M tokens
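The formula above can be expressed directly in code. The default rates are the Claude 4.5 sample pricing quoted here; actual rates depend on the model configured for your project.

```python
def llm_token_cost_usd(input_tokens: int, output_tokens: int,
                       input_rate_per_m: float = 3.0,
                       output_rate_per_m: float = 15.0) -> float:
    """(Input Tokens × Input Rate) + (Output Tokens × Output Rate).

    Rates are quoted per 1M tokens; defaults use the Claude 4.5
    sample pricing from the text above.
    """
    return (input_tokens * input_rate_per_m
            + output_tokens * output_rate_per_m) / 1_000_000

# e.g. 1M input tokens at the sample rate costs $3.00
print(llm_token_cost_usd(1_000_000, 0))  # → 3.0
```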
C. Workflow Executions
| Action | Credits |
|---|---|
| Workflow execution | 1 per execution |
The cost is 1 credit per execution, regardless of the number of steps involved.
Full Chat Example
Scenario
User asks: “Explain the attached document.”
Platform credits
| Component | Credits |
|---|---|
| Question asked | 1 |
| Tool calls (document retrieval) | 2 |
| Subtotal | 3 |
LLM token usage
| Type | Tokens | Cost |
|---|---|---|
| Input | ~53,634 | ~$0.161 |
| Output | ~900 | ~$0.0135 |
| Total | — | ~$0.1745 |
Converted to credits: 17 credits
Final total
| Component | Credits |
|---|---|
| Fixed platform | 3 |
| LLM usage | 17 |
| Total | 20 |
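The arithmetic in this example can be reproduced as follows. Note that the dollar-to-credit conversion rate is not stated explicitly in this document; the `usd_per_credit` value of $0.01 is an assumption that happens to be consistent with the 17-credit figure above.

```python
def chat_total_credits(platform_credits: int, llm_cost_usd: float,
                       usd_per_credit: float = 0.01) -> int:
    """Total chat credits = fixed platform credits + LLM usage credits.

    Assumption: $0.01 per credit, truncating fractional LLM credits.
    The actual conversion is set by your pricing model.
    """
    return platform_credits + int(llm_cost_usd / usd_per_credit)

# Worked example: 1 (question) + 2 (tool calls) platform credits,
# ~$0.1745 of LLM usage → 3 + 17 = 20 credits
print(chat_total_credits(1 + 2, 0.1745))  # → 20
```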
Document Upload
Total Credits = (Total Words ÷ 10,000) + LLM Token Credits (Parsing)
Chat Message
Total Credits = Fixed Message Credit + Tool Call Credits + LLM Token Credits (Input + Output)
What Drives LLM Credit Usage?
LLM cost increases when:
▶ Large documents are retrieved into context
▶ Many KB chunks are injected into the prompt
▶ Responses are long or structured
▶ Multiple tool calls are triggered
▶ Higher-cost models are selected
Important Notes for Customers
✓ Word-based ingestion cost is predictable.
✓ Chat costs vary significantly depending on document size and the number of tokens used.
✓ Model pricing is configurable in Super Admin.
✓ LLM credits are consumption-based and cannot be flat-rated.
✓ Final credit total = Platform Credits + LLM Credits.