| Availability | Odoo Online, Odoo.sh, On Premise |
| Odoo Apps Dependencies | Discuss (mail) |
| Lines of code | 11562 |
| Technical Name | llm_letta |
| License | LGPL-3 |
| Website | https://github.com/apexive/odoo-llm |
Letta LLM Integration
Enterprise-ready integration bringing stateful AI agents with persistent memory to your Odoo workflows.
Transform AI interactions with Letta agents that remember context and execute Odoo tools.
What Makes Letta Different?
Stateful AI agents with long-term memory
Unlike traditional stateless chatbots, Letta agents remember your interactions and maintain context across sessions. Each thread gets its own dedicated agent with isolated memory, full MCP tool integration, and automatic lifecycle management.
Key Features
Enterprise-ready AI agent capabilities
Persistent Memory
Agents remember past conversations and maintain context indefinitely, building long-term relationships.
Agent-Based Conversations
Each thread gets its own Letta agent with dedicated memory and personality for personalized AI.
MCP Tool Access
Seamless integration with Odoo's MCP server gives agents access to all your configured tools.
Automatic Management
Agents are created, updated, and cleaned up automatically as you manage threads.
Streaming Responses
Real-time message streaming with support for tool calls, reasoning steps, and assistant responses.
Cloud & Self-Hosted
Deploy on Letta Cloud for instant setup or self-host with Docker for complete control.
How It Works
Simple three-step process
Thread Creation
When you create a thread with the Letta provider, an agent is automatically created with the configured memory, tools, and personality.
Conversation
Messages are sent to the agent, which maintains the full conversation history and can execute Odoo tools through MCP.
Memory Persistence
All interactions are stored in the agent's memory, allowing it to recall past conversations and build on context.
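In Letta API terms, those three steps boil down to a handful of client calls. The sketch below illustrates them with the upstream letta-client SDK; the module runs the equivalent automatically for each thread, the apexive fork it ships with may differ slightly, and the model handles and memory-block contents are placeholder assumptions.
```python
# Minimal lifecycle sketch with letta-client (illustrative only; the Odoo
# module performs the equivalent steps automatically for each thread).
from letta_client import Letta

# Assumption: a self-hosted Letta server is running on localhost:8283.
client = Letta(base_url="http://localhost:8283")

# 1. Thread creation -> a dedicated agent with its own memory blocks.
agent = client.agents.create(
    name="odoo-thread-42",                      # hypothetical per-thread name
    model="openai/gpt-4o-mini",                 # example model handle
    embedding="openai/text-embedding-3-small",  # example embedding handle
    memory_blocks=[
        {"label": "human", "value": "Odoo user on the sales team."},
        {"label": "persona", "value": "Helpful Odoo assistant."},
    ],
)

# 2. Conversation -> the agent keeps the full history server-side.
response = client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Summarise my open sales orders."}],
)
for message in response.messages:
    print(message)

# 3. Memory persistence -> a later call to the same agent_id still has context.
client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Which of those are overdue?"}],
)
```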
Getting Started
Choose your deployment option
Self-Hosted Setup
- Create Letta database in PostgreSQL
- Configure environment variables
- Run Docker container
- Connect to server at localhost:8283
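Once the container is up, a quick check from Python can confirm the server is reachable before you configure the provider in Odoo. A minimal sketch, assuming the upstream letta-client API (the bundled apexive fork may differ slightly):
```python
# Connectivity check against a self-hosted Letta server (assumed client API).
from letta_client import Letta

client = Letta(base_url="http://localhost:8283")

# Listing agents confirms the server answers; a fresh install returns an empty list.
print(client.agents.list())
```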
Cloud Setup
- Sign up at cloud.letta.com
- Get your API token from settings
- Configure Letta (Cloud) provider in Odoo
- Start creating threads with AI agents
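For the cloud variant only the authentication changes: the client is constructed with your API token instead of a base URL, and the Odoo provider stores that same key. A minimal sketch, assuming the upstream letta-client API and a placeholder token:
```python
# Letta Cloud connection sketch (token is a placeholder; client API assumed).
from letta_client import Letta

client = Letta(token="YOUR_LETTA_CLOUD_API_KEY")

# Agents created under this account back the threads you open in Odoo once the
# "Letta (Cloud)" provider is configured with the same key.
print(client.agents.list())
```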
Technical Details
Requirements and dependencies
Module Information
- Required module: llm_mcp_server
- Python package: letta-client
- Letta server: 0.11.7+
- License: LGPL-3
Recommended Modules
Install LLM Assistant for custom prompts, LLM Tool for custom tools, and Easy AI Chat for a user-friendly interface.
Related Modules
Explore other modules in the Odoo LLM suite
Letta Integration for Odoo LLM
Stateful AI agents with persistent memory.
Module Type: 🔌 Extension (Stateful AI Agents)
Architecture
```
┌─────────────────────────────────────────────────────┐
│                  Application Layer                  │
│   ┌───────────────┐         ┌───────────────┐       │
│   │ llm_assistant │         │  llm_thread   │       │
│   └───────┬───────┘         └───────┬───────┘       │
└───────────┼─────────────────────────┼───────────────┘
            └────────────┬────────────┘
                         ▼
   ┌───────────────────────────────────────────┐
   │        ★ llm_letta (This Module) ★        │
   │           Letta AI Integration            │
   │  🧠 Memory │ MCP Tools │ Stateful Agents  │
   └─────────────────────┬─────────────────────┘
              ┌──────────┴───────────────────┐
              ▼                              ▼
┌───────────────────────────┐  ┌───────────────────────────┐
│            llm            │  │       Letta Server        │
│    (Core Base Module)     │  │ (localhost:8283 or Cloud) │
└───────────────────────────┘  │  🧠 Persistent memory     │
                               └───────────────────────────┘
```
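In code terms, llm_letta sits between the core llm/llm_thread abstractions and the Letta server, keeping one persistent agent per thread. The sketch below is a hypothetical simplification of that responsibility, not the module's actual classes; every name in it is invented for illustration, and the client calls assume the upstream letta-client API.
```python
# Hypothetical, simplified sketch of the layering above: one Letta agent per
# Odoo thread, created on demand and removed with the thread. All class and
# method names are invented for illustration; the real module may differ.
from letta_client import Letta


class LettaBackend:
    """Maps Odoo thread ids to persistent Letta agents."""

    def __init__(self, base_url="http://localhost:8283", token=None):
        # Local server by default; pass a token to target Letta Cloud instead.
        self.client = Letta(token=token) if token else Letta(base_url=base_url)
        self._agent_ids = {}  # thread_id -> agent_id (the module persists this on the thread)

    def agent_for_thread(self, thread_id):
        # Create the thread's dedicated agent on first use, then reuse it.
        if thread_id not in self._agent_ids:
            agent = self.client.agents.create(
                name=f"odoo-thread-{thread_id}",            # hypothetical naming scheme
                model="openai/gpt-4o-mini",                 # example model handle
                embedding="openai/text-embedding-3-small",  # example embedding handle
            )
            self._agent_ids[thread_id] = agent.id
        return self._agent_ids[thread_id]

    def send_user_message(self, thread_id, text):
        # Conversation history and memory live server-side with the agent.
        return self.client.agents.messages.create(
            agent_id=self.agent_for_thread(thread_id),
            messages=[{"role": "user", "content": text}],
        )

    def drop_thread(self, thread_id):
        # Mirror thread deletion by removing the agent (assumed delete endpoint).
        agent_id = self._agent_ids.pop(thread_id, None)
        if agent_id:
            self.client.agents.delete(agent_id=agent_id)
```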
Installation
What to Install
For stateful AI agents:
```bash
# Install Python client
pip install git+https://github.com/apexive/letta-python.git@main

# Start Letta server (Docker)
docker compose up letta -d

# Install the Odoo module
odoo-bin -d your_db -i llm_letta,llm_mcp_server
```
Auto-Installed Dependencies
- llm (core infrastructure)
- llm_thread (conversation management)
Why Choose Letta?
| Feature | Letta |
|---|---|
| Memory | 🧠 Persistent across sessions |
| State | 💾 Stateful agents per thread |
| Tools | 🔧 MCP tool integration |
| Context | 📚 Long-term awareness |
Common Setups
| I want to... | Install |
|---|---|
| Stateful agents | llm_letta + llm_mcp_server |
| Memory + tools | llm_assistant + llm_letta + llm_mcp_server |
Features
- Persistent Memory: Agents maintain context across sessions
- Stateful Agents: Dedicated agent per Odoo thread
- MCP Tool Integration: Zero-config Odoo tool access
- Auto-sync: Tools automatically synchronized
- Flexible Deployment: Self-hosted or Letta Cloud
Configuration
Local Server (Default)
The default "Letta (Local)" provider connects to localhost:8283 and requires no API key.
Letta Cloud
- Get API token from Letta Cloud
- Configure provider with API key
- Use "Fetch Models" to sync available models
Technical Specifications
- Version: 18.0.1.0.0
- License: LGPL-3
- Dependencies: llm, llm_thread, llm_mcp_server
- Python Package: letta (apexive fork)
- External: Letta Server 0.11.7+
License
LGPL-3
© 2025 Apexive Solutions LLC