| Availability | Odoo Online, Odoo.sh, On Premise |
| Odoo Apps Dependencies | Discuss (mail) |
| Lines of code | 13878 |
| Technical Name | llm_fal_ai |
| License | LGPL-3 |
| Website | https://github.com/apexive/odoo-llm |
| Versions | 16.0, 18.0 |
Fal.ai Provider for Odoo LLM
Connect your Odoo instance to Fal.ai's powerful, serverless inference platform.
Use Fal.ai-hosted models through an OpenAI-compatible API for fast, cost-effective inference.
What is Fal.ai Provider?
Serverless AI inference for your Odoo instance
The Fal.ai Provider module builds on the Odoo LLM framework to leverage Fal's scalable, serverless infrastructure for AI tasks. Use state-of-the-art models without managing hardware or worrying about cold starts. Fal.ai offers OpenAI-compatible endpoints, enabling seamless integration.
Capabilities
What you can do with Fal.ai in Odoo
Chat Completions
Utilize conversational AI with Fal-hosted models for chat-based workflows and assistants.
Text Embeddings
Generate vector representations for semantic search and RAG applications (see the embeddings sketch after this list).
Function Calling
Invoke Odoo or custom functions from LLM responses for automated workflows.
Pay-Per-Call
No infrastructure to manage; pay only for what you use with serverless pricing.
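The chat and embedding capabilities above can also be exercised directly against the OpenAI-compatible endpoint. Below is a minimal embeddings sketch using the official `openai` Python client; the model id is a placeholder, and the assumption that the Fal API key is sent as the standard bearer credential should be verified against your Fal.ai account.

```python
# A minimal embeddings sketch, assuming the official `openai` Python client and
# that the Fal API key is accepted as the standard bearer credential.
from openai import OpenAI

client = OpenAI(
    base_url="https://fal.run/fal-ai/openai/v1",  # endpoint from the configuration steps below
    api_key="YOUR_FAL_API_KEY",
)

# Embed a few product descriptions for semantic search / RAG indexing.
texts = ["Ergonomic oak desk", "Adjustable LED desk lamp"]
result = client.embeddings.create(
    model="your-embedding-model",  # placeholder; use a model imported via "Fetch Models"
    input=texts,
)
vectors = [item.embedding for item in result.data]
print(len(vectors), "vectors of dimension", len(vectors[0]))
```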
Configuration
Get started in minutes
- Install this module in your Odoo instance
- Navigate to LLM → Configuration → Providers
- Create a new provider and select "OpenAI" as the provider type
- Set the Base URL to https://fal.run/fal-ai/openai/v1
- Enter your Fal API key from your Fal.ai dashboard
- Click "Fetch Models" to import available models
Technical Details
Requirements and dependencies
Module Information
- Dependencies: llm, llm_tool
- Category: Technical
- Supported models: Mistral, Llama 3, Mixtral
- License: LGPL-3
OpenAI Compatibility
Fal.ai follows the OpenAI API spec, so most tools and configurations that work with OpenAI can be reused with minimal changes.
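As a concrete illustration of that reuse, the sketch below points the official `openai` Python client at the base URL from the configuration steps above. The model id is a placeholder, and the bearer-style authentication is an assumption to confirm for your Fal.ai account.

```python
# Minimal sketch: reuse standard OpenAI tooling by swapping the base URL.
from openai import OpenAI

client = OpenAI(
    base_url="https://fal.run/fal-ai/openai/v1",  # Fal.ai's OpenAI-compatible endpoint
    api_key="YOUR_FAL_API_KEY",                   # assumed to be sent as the bearer credential
)

response = client.chat.completions.create(
    model="your-chat-model",  # placeholder; use a model imported via "Fetch Models"
    messages=[{"role": "user", "content": "Summarise this week's helpdesk tickets in two sentences."}],
)
print(response.choices[0].message.content)
```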
FAL.ai Provider for Odoo LLM
Fast image and video generation with Flux models.
Module Type: 🔧 Provider (Fast Image/Video Generation)
Architecture
┌──────────────────────────────────────────────┐
│         Used By (Generation Modules)         │
│   ┌─────────────┐        ┌────────────┐      │
│   │llm_assistant│        │llm_generate│      │
│   └──────┬──────┘        └─────┬──────┘      │
└──────────┼────────────────────┼──────────────┘
           └──────────┬──────────┘
                      ▼
┌───────────────────────────────────────────┐
│        ★ llm_fal_ai (This Module) ★       │
│              FAL.ai Provider              │
│     ⚡ Fast │ Flux │ Video │ Real-time     │
└─────────────────────┬─────────────────────┘
                      │
                      ▼
┌───────────────────────────────────────────┐
│                    llm                    │
│            (Core Base Module)             │
└───────────────────────────────────────────┘
Installation
What to Install
For fast image generation:
odoo-bin -d your_db -i llm_assistant,llm_fal_ai
Auto-Installed Dependencies
- llm (core infrastructure)
Why Choose FAL.ai?
| Feature | FAL.ai |
|---|---|
| Speed | ⚡ Very fast inference |
| Flux Models | ✅ Best Flux support |
| Video | ✅ Video generation |
| Real-time | ✅ Real-time generation |
Common Setups
| I want to... | Install |
|---|---|
| Fast image generation | llm_assistant + llm_fal_ai |
| Chat + fast images | llm_assistant + llm_openai + llm_fal_ai |
Features
- Connect to Fal.ai API with proper authentication
- Support for multiple generative AI models hosted on Fal.ai
- Text-to-image, text-to-video, and audio synthesis capabilities
- Automatic model discovery and filtering
- Async-friendly requests for long-running tasks
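As a rough illustration of the async-friendly request pattern, the sketch below uses aiohttp (the module's declared Python dependency) against the OpenAI-compatible endpoint shown in the configuration sections. The `Authorization` header format and the model id are assumptions to confirm for your account.

```python
# Async sketch with aiohttp against the OpenAI-compatible endpoint.
# The Authorization scheme and model id below are assumptions.
import asyncio
import aiohttp

API_URL = "https://fal.run/fal-ai/openai/v1/chat/completions"
HEADERS = {"Authorization": "Bearer YOUR_FAL_API_KEY"}

async def complete(prompt: str) -> str:
    payload = {
        "model": "your-chat-model",  # placeholder; use a model imported via "Fetch Models"
        "messages": [{"role": "user", "content": prompt}],
    }
    async with aiohttp.ClientSession(headers=HEADERS) as session:
        async with session.post(API_URL, json=payload) as resp:
            resp.raise_for_status()
            data = await resp.json()
            return data["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(asyncio.run(complete("Write alt text for a product photo of a desk lamp.")))
```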
Configuration
- Install the module
- Navigate to LLM > Configuration > Providers
- Create a new provider and select "Fal.ai" as the provider type
- Enter your Fal.ai API key
- Click "Fetch Models" to import available models
Technical Specifications
- Version: 18.0.1.0.0
- License: LGPL-3
- Dependencies: llm
- Python Package: aiohttp (for async inference)
License
LGPL-3
© 2025 Apexive Solutions LLC