First AI Chatbot for Dolibarr (Ticket Plus & Helpdesk Module)

Hello everyone!

I’m thrilled to announce that six months after its initial release, I’ve just updated the “Ticket Plus & Helpdesk” module with an exciting new feature: it now includes its very own AI chatbot!

This chatbot incorporates some cutting-edge features:

  1. It uses RAG (Retrieval-Augmented Generation) on the articles of the native “Knowledge Base” module, relying on vector embeddings to find the most relevant information for each visitor’s query, whether that visitor is a customer, user, or supplier.

  2. It uses the ChatGPT API (OpenAI) for MULTILINGUAL conversational abilities. This means we can keep our knowledge base articles in French, and the chatbot will still UNDERSTAND AND RESPOND perfectly in OTHER LANGUAGES: Spanish, English, Catalan, Portuguese, etc. (see the sketch right after this list for how retrieval and generation fit together).
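
To make the RAG idea above concrete, here is a minimal retrieve-then-generate sketch, assuming the official `openai` Python package (v1.x). The article texts, model names, and the `top_k` value are illustrative assumptions of mine, not the module’s actual code (the module itself is PHP inside Dolibarr):

```python
# Minimal RAG sketch: embed knowledge-base articles, retrieve by cosine
# similarity, then let the chat model answer in the visitor's language.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

articles = [
    "How to reset your customer portal password ...",
    "Shipping delays and how to track your order ...",
]

def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

article_vectors = embed(articles)

def answer(question, top_k=2):
    # 1) Retrieve: rank articles by cosine similarity to the question.
    q = embed([question])[0]
    sims = article_vectors @ q / (
        np.linalg.norm(article_vectors, axis=1) * np.linalg.norm(q)
    )
    context = "\n\n".join(articles[i] for i in np.argsort(sims)[::-1][:top_k])

    # 2) Generate: answer from the retrieved context, in the question's language.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer using only the context below, in the same "
                        "language as the question.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("¿Cómo puedo restablecer mi contraseña?"))
```

The system prompt (“answer in the same language as the question”) is what makes the multilingual behaviour work even when every article is written in French.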

To get started, you’ll need to:

  1. Sign up (for free) as a developer on OpenAI (https://platform.openai.com). You may need to provide your credit card details.

  2. Once your account is created, generate an “API key” (https://platform.openai.com/api-keys) and paste it into the TicketPlus module configuration (see the quick check below to make sure the key works).
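
If you want to confirm the key is valid before pasting it in, a quick check like this works (again assuming the `openai` Python package; the key string is a placeholder):

```python
# Sanity-check an OpenAI API key before configuring the module.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # replace with the key you just generated
print([m.id for m in client.models.list().data][:5])  # prints a few model ids if the key works
```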

Worried about the cost of using AI? Don’t be! The most expensive part is actually buying my module (€60, one-time payment). The API consumption is highly optimized, and the current rates for the “gpt-4o-mini” model API are minimal (https://openai.com/api/pricing/):

  • $0.15 per MILLION INPUT TOKENS
  • $0.60 per MILLION OUTPUT TOKENS

What does this mean in practice? Well, a token is roughly three-quarters of a word (about four characters of English text). In my tests, answering a visitor’s question using about 4 short articles from the “knowledge base” consumes approximately ONE THOUSAND TOKENS.

This means ONE CENT USD comfortably covers several dozen visitor questions (roughly 40 to 65 with the figures above, depending on how the tokens split between input and output). The calculation isn’t exact, but it gives you an idea of the AI cost of running a RAG-based chatbot focused on your business.
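
Here is the back-of-the-envelope arithmetic behind that estimate; the 800/200 input/output split is my assumption, not a measured value:

```python
# Cost per answered question at the rates quoted above.
INPUT_USD_PER_M = 0.15    # USD per million input tokens
OUTPUT_USD_PER_M = 0.60   # USD per million output tokens

input_tokens, output_tokens = 800, 200  # ~1,000 tokens total per question (assumption)

cost = (input_tokens * INPUT_USD_PER_M + output_tokens * OUTPUT_USD_PER_M) / 1_000_000
print(f"~${cost:.5f} per question, ~{0.01 / cost:.0f} questions per US cent")
# -> ~$0.00024 per question, ~42 questions per US cent
```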

To me, this seems very reasonable: say a visitor asks around 10 questions in a single chat conversation. Serving ten such visitors, 24/7, with good information and intelligence then costs only a couple of US cents. Wow! :slight_smile:

Disclaimer

This is the first version of my module with a chatbot. Until now, it only had semantic search (also using OpenAI embeddings). You can continue using it that way, or even choose not to use AI at all. It’s up to you, but I recommend giving it a try.

That said, PLEASE, if you have any questions or suggestions for improvement, don’t hesitate to contact me. I believe this is an opportunity for all of us to enjoy a chatbot natively in our Dolibarr, with no software dependency other than the API connection to an LLM provider.

Future Plans

I’ve read on this forum for several months that there’s a lot of interest in similar chatbot technology that would let a customer query OTHER Dolibarr data (products, for example) and even create an order “conversationally” (even if only as a draft).
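
For anyone curious how that could look, here is a hedged sketch using OpenAI tool calling. The tool name, its parameters, and the Dolibarr endpoint mentioned in the comment are purely illustrative assumptions; nothing like this exists in the module today:

```python
# Sketch of "conversational" draft-order creation via OpenAI tool calling.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "create_draft_order",  # hypothetical tool, not part of the module
        "description": "Create a draft sales order in Dolibarr",
        "parameters": {
            "type": "object",
            "properties": {
                "customer_ref": {"type": "string"},
                "lines": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "product_ref": {"type": "string"},
                            "qty": {"type": "number"},
                        },
                        "required": ["product_ref", "qty"],
                    },
                },
            },
            "required": ["customer_ref", "lines"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user",
               "content": "Please order 5 units of ART-001 for customer C-42."}],
    tools=tools,
)

for call in resp.choices[0].message.tool_calls or []:
    args = json.loads(call.function.arguments)
    # Here the module would call the Dolibarr REST API (e.g. POST .../api/index.php/orders)
    print("Would create draft order:", args)
```

The nice part of this approach is that the model only proposes structured arguments; the module stays in control of actually calling Dolibarr and can keep the order in draft mode for human validation.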

If you’re an LLM enthusiast like me, you know there’s so much potential. Let’s go for it! I’m counting on your collaboration.

Links