Chatbot and Artificial Intelligence

Introduction

The Pandora ITSM chatbot is made up of two services: a conversation or chat server (with its web client) and an optional conversational artificial intelligence engine.

The artificial intelligence engine can learn from the information entered into the Pandora ITSM knowledge base to provide quick answers to user questions. The chatbot can operate in three modes:

  • Automatic: uses machine learning models to automatically provide answers to the user based on prior learning.
  • Manual: a Pandora ITSM operator responds to the user through the chat interface.
  • Mixed: Pandora ITSM assists the operator by showing possible answers to the user's questions.

The architecture consists of three fundamental elements:

  • The chat client, with which the user interacts.
  • The chat server, with which the operator interacts.
  • The prediction engine, which offers answers based on the user's questions.

Both the client and the chat server are part of a Pandora ITSM installation.

To add predictive capabilities to Hybrid Helpdesk you will need to install and activate the Prediction Engine as described in the following sections.

Chat server installation

By default, the chat server is installed when using the cloud installation method. To activate it, go to Setup → Setup → ChatBot and enable the Enable chat option:

The web service URL must be configured with the public URL of your Pandora ITSM installation, as configured in the main section of Setup:

If it does not work as expected, follow the Node.js installation steps described below.

If AI has been activated (see “AI configuration”), the AI options section with the Update KB model and Update conversational model buttons will also appear.


HTTPS configuration for the Chatbot

You must activate Enable SSL. By default it has the following values:

You must replace the default certificates with your own. Save with the Update button and check that Check chat server turns green to confirm it is working correctly.
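For testing purposes, a self-signed certificate can be generated with OpenSSL and then referenced from the SSL fields (a minimal sketch; the file paths and host name below are only examples and must match whatever you actually configure in the chat setup):

# Example only: self-signed certificate for testing; paths and CN are illustrative.
openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -keyout /etc/pki/tls/private/chat_server.key \
  -out /etc/pki/tls/certs/chat_server.crt \
  -subj "/CN=itsm.example.com"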

Node.js installation

# Pandora ITSM Enterprise deployment script (skip if Pandora ITSM is already installed):
curl -SsL http://firefly.artica.es/projects/integria/integria_deploy_enterprise.sh | sh
# Install Node.js 12.x and the pm2 process manager:
curl -sL https://rpm.nodesource.com/setup_12.x | sh
yum install -y nodejs
npm i -g pm2
# Install/update the chat server dependencies and start it:
cd /var/www/html/integria/extras/chat_server
npm update
pm2 start server.js
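To check that the chat server process came up correctly under pm2, its status and recent output can be inspected (an optional quick check; the process name server comes from starting server.js):

pm2 status                    # the server process should appear as "online"
pm2 logs server --lines 20    # last lines of chat server output, useful for troubleshooting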

Edit the file /etc/systemd/system/integria-chat.service so that the service starts automatically every time the operating system boots, adding the following content:

[Unit]
Description=Integria-Chat-Server
After=network.target

[Service]
Type=simple
ExecStart=/usr/bin/node server.js
Restart=always
# Consider creating a dedicated user for the chat service instead of running it as root:
User=root
Environment=NODE_ENV=production
WorkingDirectory=/var/www/html/integria/extras/chat_server

[Install]
WantedBy=multi-user.target

Save the file and run:

systemctl daemon-reload
systemctl start integria-chat.service
systemctl enable integria-chat
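To confirm that the unit started correctly and to review its output, the standard systemd tools can be used (optional check):

systemctl status integria-chat.service
journalctl -u integria-chat.service -n 50    # last 50 log lines of the chat server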

Access to the chat server database must be configured. Edit the file /var/www/html/integria/extras/chat_server/config/config.js and modify the necessary database parameters. Default values are assumed; change them if you have a custom installation:
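The exact contents of config.js depend on the version shipped with your installation. As a quick sanity check, the sketch below verifies that the database credentials you plan to use actually work, assuming the default integria user and database name, and then restarts the chat service so any new settings are applied:

# Assumption: default "integria" database and user; adjust to your installation.
mysql -u integria -p integria -e "SELECT COUNT(*) FROM tkb_data;"
# After editing config.js, restart the chat service to apply the new settings:
systemctl restart integria-chat.service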

Installation of the artificial intelligence engine

Run the following commands as root (super user):

# Install Python 3 and Cython (required by the prediction engine):
yum install -y python3 python36-Cython
# Download and unpack the prediction engine package:
wget http://xxxxxx/prediction_engine-latest.tgz
tar xvzf prediction_engine-latest.tgz
cd prediction_engine
# Run the installer and restart the service:
./install.sh
systemctl restart prediction_engine.service

This will start the service on port 6000/tcp.
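To verify that the engine is actually listening on that port, it can be checked from the same host (optional check):

ss -ltn | grep ':6000'    # the prediction engine should appear in LISTEN state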

AI Settings

The AI is configured independently on each channel. To do this, go to Support → Chat → Channel Management and edit the settings:

  • IA url: URL where the artificial intelligence engine web service listens. It must have the format:
HTTP://DIR_IP
  • IA Port: Default 6000. You should not change this unless port forwarding is used somewhere in between.
  • Initial response time: Time, in seconds, within which the AI will answer the first user request.
  • Response time between conversations: Response time between successive questions once the conversation has started.
  • Progress bar timeout: Time the system allows the operator to choose an answer from those offered before automatically answering with the best option.
  • Certainly threshold: Degree of uncertainty tolerated by the AI when choosing one of the offered options. The value must be between 0 and 1; an initial value of 0.5 (50%) is recommended, to be adjusted as necessary.

Backup of custom learning models

  • Before carrying out any customization, it is recommended to make a backup of the current models:
cp -r /opt/prediction_engine/models /opt/prediction_engine/models.bak
  • To restore the models from the backup run:
cp -f /opt/prediction_engine/models.bak/* /opt/prediction_engine/models/

Questions and Answers (Knowledge Base)

This type of modification is advanced; by default, the Update KB Model and Update Conversational Model buttons of the Pandora ITSM Web Console should be used.

The question and answer model is trained from a CSV file with the following format:

Language code;Question;Answer

Example:

"es";"pregunta 1";"respuesta 1"
"es";"pregunta 2";"respuesta 2"
"en";"question 1";"answer 1"
"en";"question 2";"answer 2"

It can be obtained automatically from the Pandora ITSM database with the commands:

rm -f /opt/prediction_engine/data/integria_kb.zip 2>/dev/null
echo "SELECT id_language AS lang, title as question, data as answer FROM integria.tkb_data INTO OUTFILE '/opt/prediction_engine/data/integria_kb.csv' FIELDS TERMINATED BY ';' ENCLOSED BY '\"' LINES TERMINATED BY '\n';" | mysql -u integria -p integria && zip -j /opt/prediction_engine/data/integria_kb.zip /opt/prediction_engine/data/integria_kb.csv

To update the models:

cd /opt/prediction_engine/src
python kb_train.pyc
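If the prediction engine service was already running, restarting it ensures the newly trained model is picked up (an assumption about this setup; skip this step if your version reloads models automatically):

systemctl restart prediction_engine.service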

Dialogues (conversational)

Menu Support → Chat → Browse → Data management.

In this section of the Pandora ITSM Web Console you will be able to add conversational items classified by categories and languages.

To search for a specific conversational item, enter a keyword found in the title and/or answer and select the category and language to which it belongs.

Advanced configuration

This type of modification is advanced; by default, the Update KB Model and Update Conversational Model buttons of the Pandora ITSM Web Console should be used.

The conversational model is trained from YAML files with the following structure:

categories:
  - category 1
  - category 2
  - ...
conversations:
  - - text 1
    - text 2
    - ...
  - - text 1
    - text 2
    - text 3
    - ...

The .yaml files (the file name is not important) should be placed in the directory:

/opt/prediction_engine/data/chat_xx

Where xx is the ISO code of the language to be updated (en for English).
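As an example, the commands below place a dialogue file for English and check that it is valid YAML before training (the file name my_dialogues.yaml is arbitrary, and the syntax check assumes the PyYAML module is available for python3):

mkdir -p /opt/prediction_engine/data/chat_en
cp my_dialogues.yaml /opt/prediction_engine/data/chat_en/
# Optional syntax check (requires PyYAML):
python3 -c "import yaml; yaml.safe_load(open('/opt/prediction_engine/data/chat_en/my_dialogues.yaml'))"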

To update the models run:

cd /opt/prediction_engine/src
python chat_train.pyc

How to use the chat

The Pandora ITSM chatbot uses channels to define different places where conversations take place between operators (Pandora ITSM users with special permissions to manage conversations) and regular users (normal Pandora ITSM users, or anonymous visitors if the chat is used from outside Pandora ITSM).

Manage a channel

Menu Support → Chat → Manage Channel:

Add operators to the channel

The channel must have at least one operator, who will answer the users' chat requests. You will be able to add any user with chat operator permissions and assign them a description, an avatar (different from the one in their user file) and the languages in which they can answer chats.

Click on Edit users of the corresponding channel to add a user and then click on the Add User button:

Use this screen to define an operator. Example:

Once the operator has been added, other users will be able to start chats.

Channel operators

In order for operators to receive audible notifications when a user opens a chat, they must be on the chat control screen. It is accessed through the menu Support → Chat → View chat.

The operator will enter one of those chats and interact with the other party:

In the event that the chat is opened by an internal user, it will indicate which user it is.

Using the chat outside of Pandora ITSM

To use the chat outside the Pandora ITSM interface, for example on a web page or in another application, press the star icon to show the snippet of JavaScript code to embed in the application:

Note that the URL included in the code is the one defined as the public URL in the chat server options. That URL must be accessible from wherever users connect to it; in most cases this means it must be a public Internet URL.

Example:

The conversation starts by clicking on the Help button icon at the bottom right:
