Availability | Odoo Online, Odoo.sh, On Premise |
Odoo Apps Dependencies | Discuss (mail) |
Lines of code | 2607 |
Technical Name | llm_ollama |
License | LGPL-3 |
Website | https://github.com/apexive/odoo-llm |
Ollama Provider for Odoo LLM
Connect your Odoo instance with locally deployed open-source models.
v16.0.1.1.0
LGPL-3
GitHub Repository
This module provides integration with Ollama for accessing locally deployed open-source models with full privacy and control.


Overview
The Ollama Provider module extends the LLM Integration Base to connect with locally deployed Ollama models. This module allows you to use Llama, Mistral, Vicuna, and other open-source models directly from your Odoo instance with full privacy and control.
Key Capabilities
- Text generation - Generate text responses using various open-source models
- Function calling - Enable AI models to execute functions through a standardized interface
- Model discovery - Automatically discover available models from your Ollama instance
- Local deployment - Run models locally for full privacy and control, with no API costs
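
To give a sense of what these capabilities look like at the API level, here is a minimal sketch that talks to a local Ollama server directly with Python's requests library. The model name ("llama3.1") and the prompt are placeholders, and the module itself routes such calls through the Odoo LLM provider framework rather than this raw client.

```python
# Minimal sketch: text generation against a local Ollama server.
# Assumes Ollama is running on the default port and that the
# "llama3.1" model has already been pulled (`ollama pull llama3.1`).
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama server URL

response = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama3.1",  # placeholder model name
        "prompt": "Summarize what an ERP system does in one sentence.",
        "stream": False,      # return a single JSON object instead of chunks
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```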
Configuration
Setting up the Ollama provider is straightforward:
- Install the module in your Odoo instance
- Set up Ollama on your server or local machine
- Navigate to LLM > Configuration > Providers
- Create a new provider and select "Ollama" as the provider type
- Enter your Ollama server URL (default: http://localhost:11434)
- Click "Fetch Models" to import available models
- Set default models for text generation
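
Before clicking "Fetch Models", it can help to confirm that the server URL you entered is reachable and to see which models Ollama has already pulled. The snippet below is a quick check against Ollama's /api/tags endpoint; it is not part of the module, just a way to verify the same server the provider will use.

```python
# Quick connectivity check for the Ollama server configured in the provider form.
# Lists the locally available models that "Fetch Models" would import.
import requests

OLLAMA_URL = "http://localhost:11434"  # use the URL from your provider configuration

tags = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
tags.raise_for_status()
for model in tags.json().get("models", []):
    print(model["name"])  # e.g. "llama3.1:latest", "mistral:7b"
```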
Technical Details
This module extends the base LLM integration framework with Ollama-specific implementations:
- Implements the Ollama API client with proper configuration
- Provides model mapping between Ollama formats and Odoo LLM formats
- Supports function calling capabilities
- Handles streaming responses for real-time text generation
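
The streaming behaviour mentioned above maps onto Ollama's chat endpoint, which returns newline-delimited JSON chunks when "stream" is enabled. The sketch below shows that raw exchange against the Ollama API; how the module surfaces these chunks inside Odoo is an implementation detail of the framework and is not reproduced here.

```python
# Sketch of a streaming chat completion against the Ollama API.
# Each line of the response body is a JSON chunk holding a partial message.
import json
import requests

OLLAMA_URL = "http://localhost:11434"

with requests.post(
    f"{OLLAMA_URL}/api/chat",
    json={
        "model": "llama3.1",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello, who are you?"}],
        "stream": True,
    },
    stream=True,
    timeout=120,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk["message"]["content"], end="", flush=True)
        if chunk.get("done"):
            break
```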
Module Information
Name | Ollama Provider for Odoo LLM |
---|---|
Version | 16.0.1.1.0 |
Category | Technical |
License | LGPL-3 |
Dependencies | llm, llm_tool, llm_mail_message_subtypes |
Author | Apexive Solutions LLC |