| | |
|---|---|
| Availability | Odoo Online · Odoo.sh · On Premise |
| Lines of code | 5091 |
| Technical Name | odoocraft_kafka_connector |
| License | OPL-1 |
OdooCraft Kafka Connector
Connect Odoo to Apache Kafka. Produce, consume, and encode messages — managed entirely from the Odoo UI.
Odoo 18 Community Edition · Apache Kafka · AVRO · Schema Registry
The Problem
Odoo is a closed loop. Data changes inside Odoo — orders confirmed, inventory moved, invoices paid — and nothing outside knows about it unless you build custom integrations from scratch. Each one is a one-off: brittle webhooks, cron-based CSV exports, or expensive middleware that sits between systems and adds latency.
Apache Kafka solves this at the architecture level. A central event bus where every system publishes what it knows and subscribes to what it needs. The problem is that Odoo has no native Kafka support. Until now there was no clean way to connect them.
What You Get
- Connection Management: Configure Kafka connections directly in Odoo. SASL PLAIN, SASL SSL, and plain TCP. Certificates are stored securely as binary fields.
- Topic Configuration: Define topics with consumer group, serialization format, batch size, DLQ topic, and custom config — all from a form view.
- Background Workers: Consumer workers start automatically with the Odoo process. No separate service needed. Supports both batch-commit and direct-commit modes.
- AVRO + Schema Registry: Encode and decode messages using AVRO. Schemas are loaded from Odoo addons at startup and registered to Schema Registry automatically.
- Dead Letter Queue: Failed messages are wrapped in a structured envelope (payload + headers + error type + retry count) and routed to a dedicated error topic. The offset is committed and the partition keeps moving; nothing is silently dropped.
- JSON Field Widget: Built-in OWL component for editing and viewing JSON data inline in Odoo forms. Syntax validation, read-only tree view, and key sorting.
- Message Logs: kafka.message.log stores an audit trail: direction, status, payload, partition, offset, timing, and errors. Browse it under Kafka → Message Logs.
- Live Dashboard: Real-time Kafka dashboard on the Connection form — Consumed, Produced, and Pending counters per topic, refreshed every 5 seconds directly from the broker.
Architecture
```
Odoo Instance
├── kafka.connection       — broker URL, auth method, SSL certs
├── kafka.topic            — topic name, consumer group, format (JSON/AVRO), DLQ
├── KafkaAvroHelper        — serialize / deserialize using confluent_kafka
└── SchemaRegistryHelper   — load .avsc files from addons, register to Schema Registry

Background Workers (run inside the Odoo server process)
├── workers/__init__.py    — patch PreforkServer / ThreadedServer at startup
├── KafkaWorker            — main loop, partition assignment, offset management
├── KafkaWorkerBatch       — accumulate messages, commit on batch or timeout
└── KafkaWorkerDirect      — commit each message individually, lower throughput / higher safety

External
├── Apache Kafka (KRaft or ZooKeeper)
└── Confluent Schema Registry (optional, required for AVRO)
```
How It Works
1. Install the module and the Python dependency. pip install confluent-kafka must be available in the Odoo virtualenv. Install the module from Apps, then restart the server.
2. Create a Connection. Go to Kafka → Connections, add your broker address, choose an auth method, and click Test Connection to verify.
3. Configure a Topic. Define the topic name, consumer group, message format (JSON or AVRO), batch settings, and optionally a DLQ topic for failed messages.
4. Implement your handler. Add a method to any Odoo model and point the topic record at it. The worker calls it for each incoming message.
5. Restart and verify. Workers start with the Odoo process. Logs appear under the odoocraft_kafka_connector log queue.
Configuring Topics and Consumer Handlers
Every Kafka topic is represented by a kafka.topic record linked to a connection.
Step 1 — Create the topic record
| Field | Required | Description |
|---|---|---|
| Topic Name | Yes | Exact Kafka topic name, e.g. odoo.orders |
| Create Consumer | — | Enable to subscribe and receive messages. Disable for producer-only topics. |
| Model | If consumer | The Odoo model that contains the handler method. |
| Handle Method | If consumer | Name of the Python method to call for each message. |
| AVRO Encoding | — | Enable for AVRO; requires Schema Registry URL on the connection. |
| Need Send Error Report | — | Enable DLQ forwarding. Select the Error Topic below. |
Step 2 — Write the handler in your module
```python
from odoo import api, models


class SaleOrderKafkaHandler(models.AbstractModel):
    _name = "sale.order.kafka.handler"
    _description = "Handles incoming Kafka messages for orders"

    @api.model
    def handle_order_message(self, message: dict, headers: dict = None) -> None:
        order_id = message.get("order_id")
        partner_ref = message.get("customer_ref")
        amount = message.get("amount", 0)

        partner = self.env["res.partner"].search(
            [("ref", "=", partner_ref)], limit=1
        )
        if not partner:
            raise ValueError(f"Unknown partner ref: {partner_ref}")

        self.env["sale.order"].create({
            "partner_id": partner.id,
            "client_order_ref": order_id,
            "order_line": [(0, 0, {
                "product_id": self.env.ref("product.product_product_1").id,
                "price_unit": amount,
                "product_uom_qty": 1,
            })],
        })
```
Set Model to sale.order.kafka.handler
and Handle Method to handle_order_message.
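Under the hood, the worker has to resolve this Model / Handle Method pair at runtime and invoke it once per message. The module's internal dispatch code isn't reproduced here, but the contract amounts to a dynamic lookup along these lines — the `DemoHandler` stub and the dict-based `env` below are illustrative stand-ins, not the connector's API:

```python
import json


def dispatch(env, topic_rec, raw_value, headers=None):
    """Sketch of the consumer-side contract: decode the message,
    resolve the configured model/method pair, and call the handler."""
    message = json.loads(raw_value)  # JSON topics; AVRO would decode via Schema Registry
    handler_model = env[topic_rec["model"]]  # e.g. "sale.order.kafka.handler"
    handler = getattr(handler_model, topic_rec["handle_method"])
    return handler(message, headers=headers or {})


# Minimal stand-in for an Odoo environment, just to show the call shape.
class DemoHandler:
    def handle_order_message(self, message, headers=None):
        return f"order {message['order_id']} from {headers.get('source')}"


env = {"sale.order.kafka.handler": DemoHandler()}
topic_rec = {"model": "sale.order.kafka.handler",
             "handle_method": "handle_order_message"}
dispatch(env, topic_rec, '{"order_id": "SO-1"}', headers={"source": "odoo"})
```

The important property is that the handler is plain Python reached by name — any model method with the `(message, headers)` signature can serve as a consumer endpoint.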
Step 3 — Produce messages from your module
```python
from odoo import models


class SaleOrder(models.Model):
    _inherit = "sale.order"

    def action_confirm(self):
        res = super().action_confirm()
        topic = self.env["kafka.topic"].search(
            [("name", "=", "odoo.orders")], limit=1
        )
        if topic:
            for order in self:
                topic.send_message(
                    {
                        "order_id": order.name,
                        "partner_id": order.partner_id.id,
                        "amount_total": order.amount_total,
                        "currency": order.currency_id.name,
                        "state": order.state,
                    },
                    headers={
                        "source": "odoo",
                        "correlationId": str(order.id),
                    },
                )
        return res
```
Step 4 — Enable and monitor
- Set start_kafka_worker = True in odoo.conf and restart Odoo.
- Open the topic form and click Start Consumer.
- Go to the Connection form to see the live Kafka dashboard — Consumed, Produced, Pending counters updated every 5 seconds.
- Go to Kafka → Message Logs to inspect individual messages, payloads, and errors.
The Consumer Group field shows the auto-computed Kafka consumer group ID.
All topics on the same connection share one group. If you need independent consumers, create a separate connection record.
Connection Examples
SASL PLAIN (development / Confluent Cloud)
```
Bootstrap Servers: pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
Auth Method:       SASL PLAIN
Username:          your-api-key
Password:          your-api-secret
```

SASL SSL (production, mutual TLS)

```
Bootstrap Servers:  broker.internal:9093
Auth Method:        SASL SSL
CA Certificate:     (upload ca.pem)
Client Certificate: (upload client.pem)
Client Key:         (upload client.key)
Username:           kafka-user
Password:           kafka-pass
```

Local broker (matches the bundled docker-compose)

```
Bootstrap Servers: localhost:9092
Auth Method:       SASL PLAIN
Username:          admin
Password:          admin-secret
```
A ready-to-use docker-compose.yml for local Kafka development is included in the
docker/ directory. It starts a single-node KRaft broker with SASL PLAIN authentication and Confluent Schema Registry.
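For reference, the connection-form fields map naturally onto standard confluent-kafka (librdkafka) client settings. The sketch below is an assumption about that mapping, not the module's actual generated configuration; in particular, whether SASL PLAIN runs over TLS (`SASL_SSL`) or plaintext (`SASL_PLAINTEXT`) depends on your broker — Confluent Cloud requires TLS:

```python
def client_config(bootstrap, auth, username=None, password=None,
                  ca_cert=None, client_cert=None, client_key=None):
    """Build a confluent-kafka configuration dict from connection-form
    style fields. Illustrative only: the connector's real mapping may differ."""
    cfg = {"bootstrap.servers": bootstrap}
    if auth == "SASL PLAIN":
        cfg.update({
            "security.protocol": "SASL_SSL",   # SASL_PLAINTEXT for a non-TLS dev broker
            "sasl.mechanisms": "PLAIN",
            "sasl.username": username,
            "sasl.password": password,
        })
    elif auth == "SASL SSL":
        cfg.update({
            "security.protocol": "SASL_SSL",
            "sasl.mechanisms": "PLAIN",
            "sasl.username": username,
            "sasl.password": password,
            "ssl.ca.location": ca_cert,              # path to ca.pem
            "ssl.certificate.location": client_cert, # path to client.pem
            "ssl.key.location": client_key,          # path to client.key
        })
    return cfg  # plain TCP: just bootstrap.servers


cfg = client_config("localhost:9092", "SASL PLAIN",
                    username="admin", password="admin-secret")
```

A dict like `cfg` is what `confluent_kafka.Producer` or `Consumer` would be constructed with; the key names above are standard librdkafka configuration properties.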
AVRO Configuration
To use AVRO encoding, place .avsc schema files anywhere in your Odoo addons and
configure the Schema Registry URL on the connection. Schemas are discovered and registered automatically at worker startup.
```
my_module/
└── schemas/
    └── order_event.avsc
```
```json
{
  "type": "record",
  "name": "OrderEvent",
  "namespace": "com.mycompany.odoo",
  "fields": [
    {"name": "order_id", "type": "int"},
    {"name": "state", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "timestamp", "type": "string"}
  ]
}
```
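As a quick sanity check outside Odoo, you can verify that a payload matches this schema's field list with plain Python. This is a minimal sketch covering only the primitive types used above; real AVRO validation and encoding should go through the connector (or confluent-kafka's AVRO serializer):

```python
import json

ORDER_EVENT_SCHEMA = json.loads("""
{
  "type": "record",
  "name": "OrderEvent",
  "namespace": "com.mycompany.odoo",
  "fields": [
    {"name": "order_id", "type": "int"},
    {"name": "state", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "timestamp", "type": "string"}
  ]
}
""")

# Only the primitives this schema uses; AVRO "double" accepts Python ints too.
AVRO_PRIMITIVES = {"int": int, "string": str, "double": (int, float)}


def conforms(record: dict, schema: dict) -> bool:
    """Return True if every schema field is present with a matching type."""
    return all(
        f["name"] in record
        and isinstance(record[f["name"]], AVRO_PRIMITIVES[f["type"]])
        for f in schema["fields"]
    )


conforms({"order_id": 1, "state": "sale", "amount": 99.5,
          "timestamp": "2025-11-01T14:32:07Z"}, ORDER_EVENT_SCHEMA)  # True
```

A check like this is handy in unit tests for handler code, since a record that fails it would also be rejected at serialization time.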
Dead Letter Queue (DLQ)
OdooCraft Kafka Connector provides a built-in Dead Letter Queue mechanism that captures every failed message with its full diagnostic context and routes it to a dedicated error topic for later inspection and replay.
How it works
- Decode failure — the worker cannot deserialize the raw bytes. Content is Base64-encoded and forwarded to the DLQ. Offset is committed; partition keeps moving.
- Processing failure — deserialization succeeds but the handler raises an exception. The worker retries up to the configured limit, then forwards to the DLQ.
- DLQ send failure — if the error topic is unreachable, the failure is logged but the main consumer continues. No cascading crash.
DLQ message envelope
```json
{
  "original_topic": "odoo.orders",
  "timestamp": "2025-11-01T14:32:07Z",
  "error_type": "KeyError",
  "error_message": "'partner_id' is required",
  "retry_count": 3,
  "headers": { "correlationId": "abc-123", "source": "shop-frontend" },
  "payload": { "order_id": "SO-9981", "amount": 4500 }
}
```
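An envelope of this shape can be reproduced with a few lines of standard-library Python. This is a sketch based on the example above — the field names come from that example, the `dlq_envelope` helper is hypothetical, and the worker's internal wrapper may differ:

```python
import base64
import json


def dlq_envelope(original_topic, error, payload, headers=None,
                 retry_count=0, timestamp="2025-11-01T14:32:07Z"):
    """Wrap a failed message with its diagnostic context, mirroring the
    envelope shown above. For decode failures, pass the raw bytes and
    they are Base64-encoded rather than dropped."""
    if isinstance(payload, bytes):  # decode failure: keep undecodable content
        payload = base64.b64encode(payload).decode("ascii")
    return {
        "original_topic": original_topic,
        "timestamp": timestamp,
        "error_type": type(error).__name__,
        "error_message": str(error),
        "retry_count": retry_count,
        "headers": headers or {},
        "payload": payload,
    }


envelope = dlq_envelope("odoo.orders", KeyError("'partner_id' is required"),
                        {"order_id": "SO-9981", "amount": 4500},
                        headers={"correlationId": "abc-123"}, retry_count=3)
print(json.dumps(envelope, indent=2))
```

Because the original headers (including any correlation ID) travel with the envelope, a replay consumer on the error topic can re-submit the payload to the source topic once the underlying fault is fixed.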
Configuration
- Create a dedicated
kafka.topicrecord for the error topic (e.g.odoo.orders.dlq) on the same connection. - On the source topic form, enable Need Send Error Report and select the DLQ topic.
If Need Send Error Report is checked but no error topic is configured, the worker raises a UserError on the first failure. Always set both fields together.
Message Logs
| Column | Description |
|---|---|
| Direction | Consumed (in) or Produced (out) |
| Status | Success or Error |
| Payload | Full JSON payload (interactive viewer, truncated at 64 KB) |
| Partition / Offset | Exact position in the Kafka partition |
| Processing Time | Handler execution time in milliseconds (consumed messages) |
| Error | Full error message when status is Error |
Old entries can be cleaned up via Settings → Kafka Connector → Cleanup Message Logs or by configuring the built-in scheduled action.
Odoo Startup Options
```
start_kafka_worker = True                    # Enable Kafka consumer workers (default: False)
kafka_start_delay = 10                       # Seconds to wait before starting workers (default: 10)
kafka_batch_size = 50                        # Messages per batch iteration (default: 50)
kafka_batch_timeout_ms = 10                  # poll() timeout per slot in ms (default: 10)
kafka_batch_max_wait_ms = 100                # Max wall-clock time per batch in ms (default: 100)
kafka_dashboard_broker_refresh_seconds = 5   # Live dashboard lag refresh interval (default: 5)
```
Set start_kafka_worker = True in odoo.conf to activate consumers.
Workers are disabled by default so the module can be installed without a Kafka broker available.
Screenshots
Connections list — PLAINTEXT, SASL PLAIN, and SASL SSL connections at a glance
Connection form — Broker, SASL authentication, and Schema Registry in logical groups
Live dashboard — Consumed, Produced, and Pending counters per topic refreshed every 5 seconds
Topic form — Start / Stop consumer, Reset Offset, View Logs directly from the form
AVRO mode — enable one checkbox and the Schema Registry fields appear automatically
Message Logs — produced traffic and configurable inbound logging (direction, status, timing)
Log detail — full JSON payload with syntax highlighting, partition, offset, and processing time
Requirements
| Requirement | Details |
|---|---|
| Odoo version | 18.0 Community or Enterprise |
| Python package | confluent-kafka>=2.3.0 |
| AVRO support | confluent-kafka[avro]>=2.3.0 (optional) |
| Kafka | Apache Kafka 2.8+ (KRaft or ZooKeeper) |
| Schema Registry | Confluent Schema Registry (optional, AVRO only) |
Compatibility
Designed and tested on Odoo 18 Community Edition. Does not modify any core Odoo models — all extensions are additive.
Questions or issues? Reach us at panmasiunas@gmail.com
Odoo Proprietary License v1.0

This software and associated files (the "Software") may only be used (executed, modified, executed after modifications) if you have purchased a valid license from the authors, typically via Odoo Apps, or if you have received a written agreement from the authors of the Software (see the COPYRIGHT file).

You may develop Odoo modules that use the Software as a library (typically by depending on it, importing it and using its resources), but without copying any source code or material from the Software. You may distribute those modules under the license of your choice, provided that this license is compatible with the terms of the Odoo Proprietary License (For example: LGPL, MIT, or proprietary licenses similar to this one).

It is forbidden to publish, distribute, sublicense, or sell copies of the Software or modified copies of the Software.

The above copyright notice and this permission notice must be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.