botium-connector-chatlayer v0.0.9 · MIT license

Botium Connector for Chatlayer.ai


This is a Botium connector for testing your Chatlayer.ai chatbot.

Did you read the Botium in a Nutshell articles? Be warned, without prior knowledge of Botium you won't be able to properly use this library!

How it works

Botium automatically starts a webhook and connects to the Chatlayer.ai REST API to receive chatbot responses.

It can be used like any other Botium connector with all Botium Stack components (Botium CLI, Botium Bindings, Botium Box).

Requirements

  • Node.js and NPM
  • a Chatlayer.ai bot
  • a project directory on your workstation to hold test cases and Botium configuration

Install Botium and Chatlayer.ai Connector

When using Botium CLI:

> npm install -g botium-cli
> npm install -g botium-connector-chatlayer
> botium-cli init
> botium-cli run

When using Botium Bindings:

> npm install -g botium-bindings
> npm install -g botium-connector-chatlayer
> botium-bindings init mocha
> npm install && npm run mocha

When using Botium Box:

Already integrated into Botium Box, no setup required

Connecting Chatlayer.ai chatbot to Botium

Chatlayer.ai has an asynchronous communication model: the bot response is not part of the HTTP/JSON response but is delivered to a webhook. Botium starts this webhook automatically, but it has to be registered in Chatlayer first, and you have to make sure it is reachable from the public internet.

Configure the webhook

There are two ways to set up the webhook endpoint.

Proxy server

Configure the webhook with the capabilities SIMPLEREST_INBOUND_PORT and SIMPLEREST_INBOUND_ENDPOINT. When Botium starts, the webhook is available at http://local-ip-address:inbound-port/input-endpoint. If your workstation is not reachable from the public internet, you can use a service like ngrok to expose it:

> ngrok http 1234

The webhook is available at https://something.ngrok.io/input-endpoint then.
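A minimal botium.json fragment for this setup could look as follows (the port 1234 and the endpoint path /input-endpoint are placeholder values, to be replaced with your own):

```
{
  "botium": {
    "Capabilities": {
      "PROJECTNAME": "<whatever>",
      "CONTAINERMODE": "chatlayer",
      "SIMPLEREST_INBOUND_PORT": 1234,
      "SIMPLEREST_INBOUND_ENDPOINT": "/input-endpoint"
    }
  }
}
```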

Proxy server with redis

Configure the webhook with the capability SIMPLEREST_INBOUND_REDISURL. Then start an inbound proxy with botium-cli:

> botium-cli inbound-proxy

 redis://127.0.0.1:6379
 Botium Inbound Messages proxy is listening on port 45100
 Botium Inbound Messages endpoint available at http://127.0.0.1:45100/

To make it public you can use ngrok:

> ngrok http 45100

The webhook is available at https://something.ngrok.io then.

Once your proxy server is up and running, you can register the webhook URL together with a verify token of your choice at Chatlayer.ai.

Create a botium.json in your project directory:

{
  "botium": {
    "Capabilities": {
      "PROJECTNAME": "<whatever>",
      "CONTAINERMODE": "chatlayer",
      "CHATLAYER_URL": "...",
      "CHATLAYER_VERIFY_TOKEN": "...",
      "CHATLAYER_ACCESS_TOKEN": "...",
      "SIMPLEREST_INBOUND_REDISURL": "redis://127.0.0.1:6379"
    }
  }
}

To check the configuration, run the emulator (Botium CLI required) to bring up a chat interface in your terminal window:

> botium-cli emulator

Botium setup is ready, you can begin to write your BotiumScript files.
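For example, a minimal BotiumScript convo file (hello.convo.txt) could look like this — the bot reply shown here is purely illustrative and has to match what your Chatlayer.ai bot actually answers:

```
hello world test

#me
hello

#bot
Hello! How can I help you?
```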

How to start samples

  • Adapt botium.json in the sample/simple directory
  • Install packages, start the inbound proxy (in a separate shell) and run the tests.

This sample uses the webhook configuration described under Proxy server with redis.

> cd ./samples/simple
> npm install
> npm run inbound
> npm test

Supported Capabilities

Set the capability CONTAINERMODE to chatlayer to activate this connector.

CHATLAYER_URL *

Chatlayer API URL.

CHATLAYER_CHANNEL_ID *

Chatlayer channel identifier. You can find it in the Configure Webhook dialog in the Chatlayer.ai interface.

CHATLAYER_VERIFY_TOKEN *

Chatlayer Webhook verify token.

CHATLAYER_ACCESS_TOKEN *

Chatlayer access token. You can generate one under the Tokens menu in the Chatlayer.ai interface.

CHATLAYER_SESSION_DATA

Optionally, you can set session data as a JSON object.
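A hypothetical example, assuming your bot reads a firstName session variable (the key name is purely illustrative):

```
"CHATLAYER_SESSION_DATA": { "firstName": "Jane" }
```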

CHATLAYER_WELCOME_MESSAGE

Set to true if your bot sends a welcome/intro message.

CHATLAYER_BOT_ID

For detailed NLP data, the bot id has to be set. You can copy it from the URL of your chatbot in the Chatlayer.ai interface. E.g. if the URL is 'https://app.chatlayer.ai/bots/abcdabcd/DRAFT', the bot id is abcdabcd.

CHATLAYER_VERSION

The bot version to use, either DRAFT or LIVE. The default value is DRAFT.

CHATLAYER_LANGUAGE

The language of your chatbot. The default value is en.

SIMPLEREST_INBOUND_PORT

The port the Botium inbound webhook listens on.

SIMPLEREST_INBOUND_ENDPOINT

The endpoint path of the inbound webhook, e.g. /chatlayer

SIMPLEREST_INBOUND_REDISURL

The Redis URL used by the inbound proxy, e.g. redis://127.0.0.1:6379

Roadmap

  • Support for entity asserter