LocalAI/examples/telegram-bot/README.md

# Telegram bot


This example uses a fork of chatgpt-telegram-bot to deploy a Telegram bot backed by LocalAI instead of OpenAI.

```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/telegram-bot

# Clone the bot fork
git clone https://github.com/mudler/chatgpt_telegram_bot

# Copy the LocalAI docker-compose file into the bot directory
cp -rf docker-compose.yml chatgpt_telegram_bot

cd chatgpt_telegram_bot

# Create the configuration files from the shipped examples
mv config/config.example.yml config/config.yml
mv config/config.example.env config/config.env
```
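After renaming the example files, `config/config.yml` needs the bot token you obtained from Telegram's @BotFather. The exact key names depend on the fork, so treat the snippet below as a sketch and verify it against `config/config.example.yml`:

```yaml
# config/config.yml (sketch; key names assumed from the upstream
# chatgpt_telegram_bot example -- check config.example.yml for the real ones)
telegram_token: "123456789:YOUR-TELEGRAM-BOT-TOKEN"
# With LocalAI as the backend the OpenAI key is not used for billing,
# but the field may still need a non-empty value:
openai_api_key: "sk-not-needed-for-localai"
```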

```bash
# Edit config/config.yml to set the Telegram bot token
vim config/config.yml

# Run the bot
docker-compose --env-file config/config.env up --build
```
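Once the containers are up, you can sanity-check the LocalAI API directly, since it exposes an OpenAI-compatible HTTP endpoint. The port and model name below are assumptions based on the defaults in this example's `docker-compose.yml`:

```shell
# List the models LocalAI has loaded (assumes the default 8080 port mapping)
curl http://localhost:8080/v1/models

# Smoke-test the chat endpoint the bot talks to
# ("gpt-3.5-turbo" is the alias this example maps to gpt4all-j)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```

If the first command hangs or fails, the model download from the first start is likely still in progress; watch `docker-compose logs -f` until it completes.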

Note: on first start, LocalAI is configured to download gpt4all-j (in place of gpt-3.5-turbo) and stablediffusion for image generation. The download is larger than 6 GB; if your network connection is slow, adapt the healthcheck section of the docker-compose.yml file accordingly (for instance, replace 20m with 1h). To configure models manually instead, comment out the PRELOAD_MODELS environment variable in the docker-compose.yml file and see, for instance, the chatbot-ui-manual example's model directory.
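The healthcheck tweak mentioned above uses the standard Compose healthcheck syntax. The stanza shipped with this example may differ in its exact values and probe command, so the following is only a sketch of the kind of change to make:

```yaml
# docker-compose.yml (sketch): give the api service more time to finish
# the >6 GB first-start model download before the healthcheck gives up
healthcheck:
  test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]  # probe endpoint assumed
  interval: 1m
  timeout: 1h     # raised from the example's 20m for slow connections
  retries: 20
```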