| Availability | Odoo Online, Odoo.sh, On Premise |
| Odoo Apps Dependencies | Discuss (mail) |
| Lines of code | 3071 |
| Technical Name | llm_ollama |
| License | LGPL-3 |
| Website | https://github.com/apexive/odoo-llm |
| Versions | 16.0 18.0 |
Ollama Provider for Odoo LLM
Connect your Odoo instance with locally deployed open-source models.
Run Llama, Mistral, Vicuna, and more locally with full privacy.
What is Ollama Provider?
Run open-source AI models on your own hardware
The Ollama Provider module extends the LLM Integration Base to connect with locally deployed Ollama models. This module allows you to use Llama, Mistral, Vicuna, and other open-source models directly from your Odoo instance with full privacy and control, without sending data to external APIs.
Key Benefits
Why choose local AI with Ollama
Complete Privacy
Your data never leaves your server. Run AI with complete control and compliance.
No API Costs
Run unlimited queries without per-token charges. Only pay for your hardware.
Function Calling
Enable AI models to execute functions through a standardized interface.
Configuration
Get started with local AI
- Install the module in your Odoo instance
- Set up Ollama on your server or local machine
- Navigate to LLM → Configuration → Providers
- Create a new provider and select "Ollama" as the provider type
- Enter your Ollama server URL (default: http://localhost:11434)
- Click "Fetch Models" to import available models
- Set default models for text generation
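Behind the scenes, the "Fetch Models" step maps to Ollama's documented `GET /api/tags` endpoint, which lists the models pulled on the server. A minimal sketch of how a provider could parse that response (the helper names and sample payload are illustrative, not the module's actual code):

```python
import json
import urllib.request


def parse_model_names(payload):
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in payload.get("models", [])]


def fetch_model_names(base_url="http://localhost:11434"):
    """Query a live Ollama server for its installed models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))


# Abridged shape of an /api/tags response:
sample = {"models": [{"name": "llama3:latest"}, {"name": "mistral:7b"}]}
print(parse_model_names(sample))  # ['llama3:latest', 'mistral:7b']
```

Each name returned here would become a selectable model record on the provider form.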
Technical Details
Requirements and dependencies
Module Information
- Odoo dependencies: llm, llm_tool, llm_mail_message_subtypes
- Python package: ollama
- Category: Technical
- License: LGPL-3
Supported Models
Llama 3, Mistral, Vicuna, Codellama, Phi, Gemma, and any model available in the Ollama library.
Related Modules
Build your complete AI ecosystem
Ollama Provider for Odoo LLM Integration
Local AI deployment with Ollama - privacy-focused, no API costs.
Module Type: 🔧 Provider (Local/Privacy-Focused)
Architecture
┌─────────────────────────────────────────────────────────────────┐
│ Used By (Any LLM Module) │
│ ┌─────────────┐ ┌───────────┐ ┌─────────────┐ ┌───────────┐ │
│ │llm_assistant│ │llm_thread │ │llm_knowledge│ │llm_generate│ │
│ └──────┬──────┘ └─────┬─────┘ └──────┬──────┘ └─────┬─────┘ │
└─────────┼───────────────┼───────────────┼───────────────┼───────┘
└───────────────┴───────┬───────┴───────────────┘
▼
┌───────────────────────────────────────────────┐
│ ★ llm_ollama (This Module) ★ │
│ Ollama Provider (Local AI) │
│ 🔒 Llama 3 │ Mistral │ CodeLlama │ Phi │ etc │
└─────────────────────┬─────────────────────────┘
┌───────────┴───────────┐
▼ ▼
┌───────────────────────────┐ ┌───────────────────────────┐
│ llm │ │ Ollama Server │
│ (Core Base Module) │ │ (localhost:11434) │
└───────────────────────────┘ └───────────────────────────┘
Installation
What to Install
For local AI chat (no external API):
```shell
# 1. Install Ollama on your server first
curl -fsSL https://ollama.ai/install.sh | sh
ollama pull llama3

# 2. Install the Odoo module
odoo-bin -d your_db -i llm_assistant,llm_ollama
```
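Once Ollama is running, a quick sanity check is to send a completion through its documented `POST /api/generate` endpoint. The sketch below only builds the request object (the endpoint and fields come from Ollama's public REST API; the helper name is our own):

```python
import json
import urllib.request


def build_generate_request(base_url, model, prompt):
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )


req = build_generate_request("http://localhost:11434", "llama3", "Say hello")
print(req.full_url)  # http://localhost:11434/api/generate
```

Passing `urllib.request.urlopen(req)` on a machine with a live Ollama server should return a JSON body whose `response` field holds the generated text.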
Why Choose Ollama?
Common Setups
| I want to... | Install |
|---|---|
| Local AI chat | llm_assistant + llm_ollama |
| Local AI + RAG | Above + llm_knowledge + llm_pgvector |
| Mixed (local + cloud) | llm_ollama + llm_openai |
Features
- Connect to Ollama with proper configuration
- Support for various open-source models (Llama, Mistral, etc.)
- Text generation capabilities
- Function calling support
- Automatic model discovery
- Local deployment for privacy and control
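Function calling works by sending the model a JSON schema describing the tools it may invoke; Ollama's `/api/chat` endpoint accepts an OpenAI-style `tools` array for this. A minimal sketch of such a payload (the `find_partner` tool and its schema are invented here for illustration, not part of the module):

```python
def build_chat_payload(model, user_message, tools):
    """Assemble a /api/chat request body with function-calling tools."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": tools,
        "stream": False,
    }


# Hypothetical tool: look up an Odoo partner record by name.
partner_lookup = {
    "type": "function",
    "function": {
        "name": "find_partner",
        "description": "Find a res.partner record by name",
        "parameters": {
            "type": "object",
            "properties": {"name": {"type": "string"}},
            "required": ["name"],
        },
    },
}

payload = build_chat_payload("llama3", "Who is Acme Corp?", [partner_lookup])
print(payload["tools"][0]["function"]["name"])  # find_partner
```

When the model decides to call a tool, its reply carries the function name and arguments, which the integration layer can then execute against the Odoo ORM and feed back into the conversation.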
Configuration
- Install Ollama on your server
- Navigate to LLM > Configuration > Providers
- Create provider with URL (default: http://localhost:11434)
- Click "Fetch Models" to import available models
Technical Specifications
- Version: 18.0.1.1.0
- License: LGPL-3
- Dependencies: llm
- Python Package: ollama
License
LGPL-3
© 2025 Apexive Solutions LLC