# Telegram bot

*(Screenshot: the Telegram bot in action)*

This example uses a fork of chatgpt-telegram-bot to deploy a Telegram bot backed by LocalAI instead of OpenAI.
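The wiring happens in docker-compose.yml: the bot container and a LocalAI container run side by side, and the bot's OpenAI-compatible client is pointed at the LocalAI endpoint instead of api.openai.com. The sketch below is illustrative only; the service names, image tag, and the `OPENAI_API_BASE`-style variable are assumptions, not the exact contents of the example's docker-compose.yml.

```yaml
# Illustrative sketch only -- service names, image, and variables are assumptions,
# not the example's actual docker-compose.yml.
version: "3.6"

services:
  api:
    image: quay.io/go-skynet/local-ai:latest   # LocalAI, OpenAI-compatible API on :8080
    ports:
      - "8080:8080"

  chatgpt_telegram_bot:
    build: .
    environment:
      - OPENAI_API_BASE=http://api:8080/v1     # assumed: point the bot at LocalAI instead of OpenAI
    depends_on:
      - api
```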

# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/telegram-bot

git clone https://github.com/mudler/chatgpt_telegram_bot

cp -rf docker-compose.yml chatgpt_telegram_bot

cd chatgpt_telegram_bot

mv config/config.example.yml config/config.yml
mv config/config.example.env config/config.env

# Edit config/config.yml to set the telegram bot token
vim config/config.yml

# run the bot
docker-compose --env-file config/config.env up --build
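The config/config.yml edit above only needs the bot token obtained from @BotFather. The snippet below is a guess at the relevant keys; check config/config.example.yml in the cloned repository for the authoritative field names.

```yaml
# Hypothetical excerpt of config/config.yml -- field names are assumptions,
# refer to config/config.example.yml in the cloned repo for the real ones.
telegram_token: "123456789:ABCdefGhIJKlmNoPQRstuVWxyZ"   # token from @BotFather
allowed_telegram_usernames: []                           # assumed: empty list allows everyone
```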

Note: on first start, LocalAI is configured to download gpt4all-j (in place of gpt-3.5-turbo) and stablediffusion for image generation. The download is larger than 6GB; if your network connection is slow, adjust the healthcheck section of docker-compose.yml accordingly (for instance, replace 20m with 1h). To configure models manually, comment out the PRELOAD_MODELS environment variable in docker-compose.yml and see, for instance, the model directory of the chatbot-ui-manual example.
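As an illustration of that healthcheck tweak, the block to look for in docker-compose.yml looks roughly like the sketch below; the test command, endpoint, and service name are assumptions about the shipped file, so treat this as a pointer to where the timeout lives rather than a drop-in replacement.

```yaml
# Sketch of the healthcheck block to adjust -- values other than the timeouts
# are assumptions about the shipped docker-compose.yml.
services:
  api:
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]  # assumed readiness endpoint
      interval: 1m
      timeout: 20m      # raise this (e.g. to 1h) on slow connections so the model download can finish
      retries: 20
```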