| Availability | Odoo Online, Odoo.sh, On Premise |
| Odoo Apps Dependencies | AI (ai_app), Discuss (mail) |
| Lines of code | 324 |
| Technical Name | slo_ai_ollam |
| License | LGPL-3 |
Odoo AI Provider
AI Ollama
Connect your Odoo AI agents to an Ollama server with a simple local-first setup. Configure your server, choose the Ollama models you want to use, and assign the provider directly on your agents.
Requirements
Make sure the Ollama server is installed and running before you configure the module, then install the Python client library:
pip install ollama
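Before wiring the module into Odoo, it can help to confirm the server is actually reachable. A minimal sketch using only the standard library, assuming the default host `http://localhost:11434` (adjust for your setup):

```python
import json
import urllib.request

def model_names(tags_payload: dict) -> list[str]:
    """Extract model names from the JSON returned by Ollama's /api/tags endpoint."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_ollama_models(host: str = "http://localhost:11434") -> list[str]:
    """Fetch the list of models available on a running Ollama server."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return model_names(json.load(resp))
```

If `list_ollama_models()` raises a connection error, start the server (`ollama serve`) before continuing.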
What You Can Configure
- Ollama host URL
- Chat model
- Embedding model
- Optional API key
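As an illustration, the four settings above might look like this. These are example values only; the actual field names and labels live in the module's settings UI and may differ:

```python
# Example values only; the real fields are configured in Odoo's settings UI.
OLLAMA_SETTINGS = {
    "host": "http://localhost:11434",       # default local Ollama endpoint
    "chat_model": "llama3",                 # any chat model you have pulled
    "embedding_model": "nomic-embed-text",  # used by the AI vector store
    "api_key": None,                        # optional; only if your server enforces auth
}
```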
1. Open Configuration Settings
Go to Configuration > Settings, open the AI section, and fill in your Ollama connection details. This is where you define the host URL and the models your agents will use.
2. Select Ollama on the Agent
Open an AI agent, choose Ollama as the LLM provider, save the record, and start testing the agent with your configured server.
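Under the hood, a chat turn against Ollama is a plain HTTP request. The sketch below follows Ollama's public `/api/chat` API and is independent of this module's internal implementation, which is not documented here:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint (non-streaming)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(host: str, model: str, prompt: str) -> str:
    """Send one chat turn to an Ollama server and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

A quick `chat("http://localhost:11434", "llama3", "Hello")` against your configured server is a simple smoke test before pointing agents at it.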
Notes
- This module works with the existing Odoo AI agent flow.
- Use an embedding model compatible with your AI vector store requirements.
- No Ollama agent is created automatically; you can use it on your existing AI agents.
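On the embedding note above: mixing models with different vector sizes breaks a vector store, so it can help to check the dimension a model actually produces before committing to it. A small helper, assuming the response shape of Ollama's `/api/embeddings` endpoint:

```python
def embedding_dim(embed_response: dict) -> int:
    """Return the vector length from an Ollama /api/embeddings response."""
    return len(embed_response["embedding"])
```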