| Field | Value |
| --- | --- |
| Availability | Odoo Online, Odoo.sh, On Premise |
| Odoo Apps Dependencies | AI (ai_app), Discuss (mail) |
| Lines of code | 519 |
| Technical Name | hm_ai_ollama |
| License | LGPL-3 |
AI Ollama
Use Ollama as a provider for Odoo AI Agents
This module extends Odoo AI so your AI Agents can use models served by Ollama. It is designed for teams that want to run local or remote Ollama models instead of relying solely on hosted providers.
Features
- Adds an Ollama provider in the AI > Providers settings section.
- Supports both local Ollama and remote Ollama-compatible endpoints.
- Loads available models dynamically from the Ollama server.
- Maps generic aliases such as qwen3 or gemma4 to installed local models when needed.
- Integrates with the existing Odoo AI Agent flow without modifying the Odoo core module directly.
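The alias mapping mentioned above can be pictured as a simple lookup: a generic alias such as qwen3 resolves to an installed variant like qwen3:8b. The helper below is an illustrative sketch of that idea, not the module's actual code; the function name and matching rule are assumptions.

```python
def resolve_alias(alias, installed_models):
    """Map a generic alias (e.g. 'qwen3') to an installed model tag.

    An exact match wins; otherwise the first installed model whose
    name starts with '<alias>:' is used. Returns None if nothing
    matches. Illustrative sketch only, not the module's real logic.
    """
    if alias in installed_models:
        return alias
    for model in installed_models:
        if model.startswith(alias + ":"):
            return model
    return None


installed = ["qwen3:8b", "gemma4:e4b", "llama3.1:8b"]
print(resolve_alias("qwen3", installed))    # qwen3:8b
print(resolve_alias("mistral", installed))  # None
```

A first-match rule like this keeps the mapping predictable when several variants of the same family are installed.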
How It Works
After installation, open the AI settings page and configure the Ollama base URL. The module fetches available models from the configured Ollama server and makes them available in the AI Agent model selection.
If your server exposes models such as qwen3:8b, gemma4:e4b, or similar variants, those models can be selected directly by Odoo AI Agents.
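The model-discovery step can be illustrated against the OpenAI-compatible models endpoint that Ollama exposes under the base URL. The sample payload below mimics the OpenAI list format; the exact fields a given Ollama version returns are an assumption here, as is the helper name.

```python
import json

# Sample payload shaped like an OpenAI-compatible /v1/models response
# (the exact fields Ollama returns may vary; this shape is an assumption).
sample_response = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "qwen3:8b", "object": "model"},
    {"id": "gemma4:e4b", "object": "model"}
  ]
}
""")


def extract_model_ids(payload):
    """Return the model identifiers from a /v1/models-style payload."""
    return [entry["id"] for entry in payload.get("data", [])]


print(extract_model_ids(sample_response))  # ['qwen3:8b', 'gemma4:e4b']
```

In a live setup the payload would come from an HTTP GET against the configured base URL rather than an embedded string.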
Configuration
Typical local setup:
- Ollama base URL: http://localhost:11434/v1
- Optional API key: leave empty
Typical remote setup:
- Ollama base URL: https://your-ollama-host.example.com/v1
- Optional API key: set it if your endpoint is protected
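A quick sanity check of the base URL before saving the settings can catch typos such as a missing scheme. The helper below is a hedged sketch using only the standard library; the module's own validation may differ.

```python
from urllib.parse import urlparse


def is_valid_base_url(url):
    """Loose sanity check for an Ollama base URL: requires an
    http(s) scheme and a host. Illustrative only; the module's
    actual validation may be stricter."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)


print(is_valid_base_url("http://localhost:11434/v1"))  # True
print(is_valid_base_url("localhost:11434"))            # False
```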
Notes
- This module depends on ai_app.
- Your Ollama server must already be running and reachable from Odoo.
- Model availability depends on what is installed on the target Ollama server.