LocalAI/examples/chatbot-ui
James Braza 2ba9762255
Cleaned up chatbot-ui READMEs (#1075)
This PR cleans up the `chatbot-ui`/`-manual` examples:
- Fixes `Dockerfile` vs `docker-compose` confusion
- Makes it clear where to view the web UI in `## Run` sections

---------

Signed-off-by: James Braza <jamesbraza@gmail.com>
2023-09-18 16:43:06 +02:00

README.md

chatbot-ui

Example of integration with mckaywrigley/chatbot-ui.

(Screenshot of the chatbot-ui web interface, captured 2023-04-26)

Run

In this example, LocalAI downloads the gpt4all model and exposes it under the name "gpt-3.5-turbo". See the docker-compose.yaml for details.
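As a rough sketch of how that works (the exact entries in this example's docker-compose.yaml may differ; the image tag and volume path here are assumptions), a LocalAI service can preload a gallery model under an alias via an environment variable:

```yaml
# Hypothetical sketch of the LocalAI service section; check the real
# docker-compose.yaml in this directory for the exact values.
services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - 8080:8080
    environment:
      # Preload a model from the model gallery and alias it as gpt-3.5-turbo
      - 'PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/gpt4all-j.yaml", "name": "gpt-3.5-turbo"}]'
    volumes:
      - ./models:/models
```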

# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/chatbot-ui

# start with docker compose
docker compose up -d --pull always

# or you can build the images with:
# docker compose up -d --build

Then browse to http://localhost:3000 to view the Web UI.
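Since LocalAI serves an OpenAI-compatible API on port 8080, you can optionally sanity-check the backend from the command line before opening the UI (this check is not part of the original example):

```shell
# List the models LocalAI has loaded (should include gpt-3.5-turbo):
curl http://localhost:8080/v1/models

# Send a test chat completion through the OpenAI-compatible endpoint:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```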

Pointing chatbot-ui to a separately managed LocalAI service

To use the chatbot-ui example with an externally managed LocalAI service, alter the docker-compose.yaml file so that it looks like the example below. The file is smaller because the section that would normally start the LocalAI service has been removed. Take care to update the IP address (or FQDN) that the chatbot-ui service tries to reach (marked <<LOCALAI_IP>> below):

version: '3.6'

services:
  chatgpt:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    ports:
      - 3000:3000
    environment:
      - 'OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXX'
      - 'OPENAI_API_HOST=http://<<LOCALAI_IP>>:8080'

Once you've edited the docker-compose.yaml, you can start it with docker compose up, then browse to http://localhost:3000 to view the Web UI.
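To confirm the externally managed LocalAI instance is actually reachable, a quick check from the Docker host might look like the following (substitute your real address for the <<LOCALAI_IP>> placeholder):

```shell
# Replace <<LOCALAI_IP>> with the IP or FQDN of your LocalAI service.
LOCALAI_HOST="http://<<LOCALAI_IP>>:8080"

# The models endpoint should respond if the service is reachable:
curl "$LOCALAI_HOST/v1/models"
```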

Accessing chatbot-ui

Open http://localhost:3000 for the Web UI.