Availability: Odoo Online, Odoo.sh, On Premise
Odoo Apps Dependencies: Discuss (mail)
Community Apps Dependencies
Lines of code: 2607
Technical Name: llm_ollama
License: LGPL-3
Website: https://github.com/apexive/odoo-llm

Ollama Provider for Odoo LLM

Connect your Odoo instance with locally deployed open-source models.

Version 16.0.1.1.0 · LGPL-3 · GitHub Repository

This module provides integration with Ollama for accessing locally deployed open-source models with full privacy and control.


Overview

The Ollama Provider module extends the LLM Integration Base to connect with locally deployed Ollama models. This module allows you to use Llama, Mistral, Vicuna, and other open-source models directly from your Odoo instance with full privacy and control.

Key Capabilities

  • Text generation - Generate text responses using various open-source models (a request sketch follows this list)
  • Function calling - Enable AI models to execute functions through a standardized interface
  • Model discovery - Automatically discover available models from your Ollama instance
  • Local deployment - Run models locally for full privacy, control, and no API costs
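
For orientation, the sketch below shows the kind of request the module ultimately issues against a local Ollama server for text generation. It is a plain HTTP example against Ollama's /api/chat endpoint, not this module's internal API; the server URL and model name are example values you would adapt to your setup.

    # Minimal text-generation sketch: a direct call to a local Ollama server.
    # This only illustrates the underlying Ollama API the provider wraps;
    # the URL and model name below are example values.
    import requests

    OLLAMA_URL = "http://localhost:11434"   # Ollama's default address

    payload = {
        "model": "llama3",                   # any model you have pulled locally
        "messages": [
            {"role": "user", "content": "Summarize the benefits of local LLMs."}
        ],
        "stream": False,                     # request a single JSON response
    }

    resp = requests.post(f"{OLLAMA_URL}/api/chat", json=payload, timeout=120)
    resp.raise_for_status()
    print(resp.json()["message"]["content"])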

Configuration

Setting up the Ollama provider is straightforward:

  1. Install the module in your Odoo instance
  2. Set up Ollama on your server or local machine
  3. Navigate to LLM > Configuration > Providers
  4. Create a new provider and select "Ollama" as the provider type
  5. Enter your Ollama server URL (default: http://localhost:11434)
  6. Click "Fetch Models" to import available models (the underlying call is sketched after these steps)
  7. Set default models for text generation
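
Before clicking "Fetch Models", it can help to confirm that the Ollama server is reachable and already has models pulled. The sketch below queries Ollama's /api/tags endpoint, which lists the locally available models the import draws from; the URL shown is simply the default from step 5.

    # Sketch: list the models a local Ollama server currently exposes.
    # GET /api/tags returns the locally pulled models; default URL assumed.
    import requests

    resp = requests.get("http://localhost:11434/api/tags", timeout=10)
    resp.raise_for_status()
    for model in resp.json().get("models", []):
        print(model["name"])                 # e.g. "llama3:latest", "mistral:7b"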

Technical Details

This module extends the base LLM integration framework with Ollama-specific implementations:

  • Implements the Ollama API client with proper configuration
  • Provides model mapping between Ollama formats and Odoo LLM formats
  • Supports function calling capabilities
  • Handles streaming responses for real-time text generation (see the streaming sketch below)
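
As a rough illustration of the streaming behaviour, the sketch below consumes Ollama's chat stream at the HTTP level: with "stream" enabled, the server returns one JSON object per line, each carrying a partial assistant message, until a final object marked "done". This is generic Ollama API usage rather than the module's own classes, and the model name is an example.

    # Sketch: read Ollama's streaming chat output (newline-delimited JSON).
    # Illustrates the transport the module builds on; model name is an example.
    import json
    import requests

    payload = {
        "model": "llama3",
        "messages": [{"role": "user", "content": "Write a haiku about Odoo."}],
        "stream": True,                      # one JSON object per line
    }

    with requests.post("http://localhost:11434/api/chat",
                       json=payload, stream=True, timeout=120) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            print(chunk.get("message", {}).get("content", ""), end="", flush=True)
            if chunk.get("done"):
                break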

Module Information

Name: Ollama Provider for Odoo LLM
Version: 16.0.1.1.0
Category: Technical
License: LGPL-3
Dependencies: llm, llm_tool, llm_mail_message_subtypes
Author: Apexive Solutions LLC

Ollama Provider for Odoo LLM

Developed by Apexive Solutions LLC

Licensed under LGPL-3

For the latest updates and additional modules, visit:

GitHub Repository: https://github.com/apexive/odoo-llm

© 2025 Apexive Solutions LLC. All rights reserved.
