Availability | Odoo Online, Odoo.sh, On Premise
Odoo Apps Dependencies | Discuss (mail)
Lines of code | 2975
Technical Name | llm_mcp
License | LGPL-3
Website | https://github.com/apexive/odoo-llm
LLM MCP
Model Context Protocol Integration for Odoo AI
Connect your Odoo LLM chatbots with external tools and services using the standardized Model Context Protocol
What is Model Context Protocol?
The Model Context Protocol (MCP) is a standardized way for AI systems to discover and interact with external tools and services.
This module extends Odoo's LLM capabilities by adding support for the Model Context Protocol, enabling AI assistants in Odoo to connect with and use tools provided by external MCP-compliant servers.
By supporting this protocol, your Odoo instance becomes part of a growing ecosystem of AI tools, allowing your LLM models to execute external functions seamlessly through a standardized JSON-RPC 2.0 based protocol.
Model Context Protocol Architecture
Key Features
Auto-Discovery
Automatically discover and register tools exposed by MCP servers without manual configuration.
Seamless Execution
Execute external tools through your LLM conversations without additional user intervention.
Standardized Protocol
Full compliance with the JSON-RPC 2.0 based Model Context Protocol specification.
Standard I/O Transport
Connect to external processes using standard input/output pipes for secure communication.
Simple Management
Easy-to-use interface for managing MCP server connections and monitoring tool availability.
Integration with LLM Tools
Works with existing LLM Tool module for a consistent experience across all AI tools.
Use Cases
Code Execution
Connect to Python, JavaScript, or other code execution environments to let your LLM process data, perform calculations, or create visualizations based on user requests.
User: Can you analyze our sales data?
LLM: I'll analyze that for you using Python.
[Executes external Python tool via MCP]
Here's a breakdown of your sales trends...
Database Operations
Give your AI the ability to query external databases or data services, analyzing information beyond what's available in Odoo.
User: Find market data for our competitors
LLM: Let me check the market database.
[Calls external database tool via MCP]
Based on the market data, your competitors...
Custom AI Capabilities
Extend your LLM with specialized AI models for image recognition, document processing, or other domain-specific tasks.
User: What's in this product image?
LLM: Let me analyze that for you.
[Calls external image recognition via MCP]
The image shows a product with...
External API Integration
Connect to weather services, mapping tools, or any other external API through a standardized interface without additional development.
User: How's the weather at our delivery location?
LLM: Let me check that for you.
[Calls weather service via MCP]
The weather at the delivery location is...
Installation Guide
Prerequisites
- Odoo v16.0 or newer
- llm module installed
- llm_tool module installed
Installation Steps
1. Download the module: Get the module from the Odoo App Store or GitHub repository.
2. Upload to your Odoo instance: Extract the module to your Odoo addons directory.
3. Install the module: Go to the Apps menu, remove the Apps filter, and search for "LLM MCP". Click Install.
4. Configure MCP servers: Navigate to LLM Configuration → MCP Servers to set up your connections.
Module Dependencies
- base
- mail
- llm
- llm_tool
Usage Guide
1. Configure an MCP Server
Start by setting up a connection to your MCP-compliant server:
- Navigate to LLM Configuration → MCP Servers
- Click Create to add a new server
- Fill in the server details:
- Name: A descriptive name for the server
- Transport Type: Select "Standard IO"
- Command: The command to execute (e.g., python mcp_server.py)
- Arguments: Any command line arguments required
- Click Save to create the server configuration
2. Start the Server and Discover Tools
Once you've configured the server, start it and discover available tools:
- Click the Start Server button to establish the connection
- The module will automatically discover and register tools exposed by the server
- View the discovered tools in the Tools tab
- You can click Refresh Tools anytime to update the tool list
3. Use MCP Tools in LLM Conversations
Once tools are registered, they can be used in LLM conversations:
- Navigate to any LLM conversation interface in Odoo
- The MCP tools will be available alongside other LLM tools
- When the LLM determines a tool should be used, it will:
- Select the appropriate tool
- Provide the necessary parameters
- Execute the tool via the MCP server
- Use the results in its response
The entire process is seamless from the user's perspective - they simply interact with the LLM assistant naturally, and the appropriate tools are invoked as needed.
Technical Details
Model Context Protocol Overview
The Model Context Protocol (MCP) is a JSON-RPC 2.0 based protocol that standardizes how AI systems discover and interact with external tools. Here's how it works:
Protocol Flow
- Initialization: The client (Odoo) sends an initialization request to the server
- Tool Discovery: The client requests a list of available tools
- Tool Registration: Tools are registered in Odoo's tool registry
- Tool Execution: When needed, the client sends tool execution requests
- Result Processing: Results are returned to the LLM for incorporation in responses
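The sketch below walks through this flow in plain Python against a hypothetical stdio server (for example one started with python mcp_server.py). It assumes newline-delimited JSON-RPC 2.0 messages and is only an illustration of the sequence, not the module's internal implementation:

# A minimal sketch of the client side of the protocol flow, assuming a stdio MCP
# server started as "python mcp_server.py" and newline-delimited JSON-RPC 2.0 messages.
import json
import subprocess

proc = subprocess.Popen(
    ["python", "mcp_server.py"],  # Command and Arguments from the server configuration
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def rpc(request_id, method, params):
    """Send one JSON-RPC 2.0 request and read one response line."""
    request = {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

# 1. Initialization
rpc(1, "initialize", {
    "clientInfo": {"name": "odoo-llm-mcp", "version": "1.0.0"},
    "protocolVersion": "0.1.0",
    "capabilities": {"tools": {}},
})

# 2-3. Tool discovery; registration in Odoo's tool registry would happen here
tools = rpc(2, "tools/list", {})["result"]["tools"]

# 4-5. Tool execution and result processing
response = rpc(3, "tools/call", {"name": "calculator",
                                 "arguments": {"expression": "2 + 2 * 3"}})
print(response["result"]["content"][0]["text"])  # -> "8"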
Implementation Details
PipeManager
The core component that manages communication with external processes:
- Creates and manages subprocess connections
- Handles non-blocking I/O with external processes
- Manages the JSON-RPC protocol communication
- Implements connection pooling and error recovery
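As an illustration of these responsibilities (not the actual PipeManager code), the following sketch restarts a dead server process before each request and uses a bounded select() wait so a misbehaving server cannot block Odoo indefinitely:

# Illustrative connection helper: relaunches the subprocess when it has exited and
# performs a non-blocking, time-limited wait on the response pipe (POSIX pipes assumed).
import json
import select
import subprocess

class StdioConnection:
    def __init__(self, command):
        self.command = command
        self.proc = None
        self._start()

    def _start(self):
        self.proc = subprocess.Popen(
            self.command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

    def request(self, payload, timeout=30.0):
        # Basic error recovery: restart the server process if it died.
        if self.proc is None or self.proc.poll() is not None:
            self._start()
        self.proc.stdin.write(json.dumps(payload) + "\n")
        self.proc.stdin.flush()
        # Wait for data instead of blocking indefinitely on readline().
        ready, _, _ = select.select([self.proc.stdout], [], [], timeout)
        if not ready:
            raise TimeoutError("MCP server did not respond in time")
        return json.loads(self.proc.stdout.readline())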
LLM Tool Integration
Extensions to the standard LLM Tool model:
- Adds MCP as a new tool implementation type
- Handles tool parameter validation
- Manages tool execution via MCP servers
- Processes and formats execution results
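The following sketch illustrates two of these steps in isolation: validating the LLM-supplied arguments against the tool's declared inputSchema (here with the third-party jsonschema package, which is an assumption rather than the module's actual approach) and flattening a tools/call result into plain text for the model:

# Illustration only: schema validation of tool arguments and result formatting.
import jsonschema

def validate_arguments(arguments, input_schema):
    """Raises jsonschema.ValidationError when arguments do not match the schema."""
    jsonschema.validate(instance=arguments, schema=input_schema)

def format_result(result):
    """Joins the text items of an MCP result's content list into a single string."""
    if result.get("isError"):
        return "Tool execution failed"
    return "\n".join(item["text"] for item in result.get("content", [])
                     if item.get("type") == "text")

# Example with the calculator tool from the protocol samples below:
schema = {"type": "object",
          "properties": {"expression": {"type": "string"}},
          "required": ["expression"]}
validate_arguments({"expression": "2 + 2 * 3"}, schema)
print(format_result({"isError": False,
                     "content": [{"type": "text", "text": "8"}]}))  # -> 8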
Developing MCP Servers
To create your own MCP-compliant server, you need to implement the following endpoints:
{ "jsonrpc": "2.0", "id": 1, "method": "initialize", "params": { "clientInfo": { "name": "odoo-llm-mcp", "version": "1.0.0" }, "protocolVersion": "0.1.0", "capabilities": { "tools": {} } } }
Server should respond with:
{ "jsonrpc": "2.0", "id": 1, "result": { "protocolVersion": "0.1.0", "serverInfo": { "name": "example-mcp-server", "version": "1.0.0" }, "capabilities": { "tools": {} } } }
{ "jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {} }
Server should respond with:
{ "jsonrpc": "2.0", "id": 2, "result": { "tools": [ { "name": "calculator", "description": "Performs mathematical calculations", "inputSchema": { "type": "object", "properties": { "expression": { "type": "string", "description": "Mathematical expression to evaluate" } }, "required": ["expression"] }, "annotations": { "title": "Calculator", "idempotentHint": true } } ] } }
{ "jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": { "name": "calculator", "arguments": { "expression": "2 + 2 * 3" } } }
Server should respond with:
{ "jsonrpc": "2.0", "id": 3, "result": { "isError": false, "content": [ { "type": "text", "text": "8" } ] } }
For the complete MCP specification, please visit:
https://github.com/llm-tools/model-context-protocol
Module Information
- Name: LLM MCP
- Version: 16.0.1.0.0
- Category: Technical
- Author: Apexive Solutions LLC
- Website: https://github.com/apexive/odoo-llm
- License: LGPL-3
- Dependencies: base, mail, llm, llm_tool
Ready to enhance your Odoo AI capabilities?
Connect your Odoo instance to the growing ecosystem of AI tools with Model Context Protocol