From 0a00a4b58e0dff3c61a7e017fe95acd3c75241dd Mon Sep 17 00:00:00 2001
From: antongisli
Date: Tue, 2 May 2023 19:24:45 +0200
Subject: [PATCH] adding mac build and example (#151)

Co-authored-by: Ettore Di Giacinto
---
 README.md                     | 37 +++++++++++++++++++++++++++++++++++
 examples/chatbot-ui/README.md | 20 +++++++++++++++++++
 2 files changed, 57 insertions(+)

diff --git a/README.md b/README.md
index a2cb6009..a06763aa 100644
--- a/README.md
+++ b/README.md
@@ -382,6 +382,43 @@ Or build the binary with `make`:
 make build
 ```
 
+## Build on Mac
+
+Building on a Mac (M1 or M2) works, but you may need to install some prerequisites with `brew`. The steps below were tested by one Mac user and found to work. Note that they run the server natively, without Docker:
+
+```
+# install build dependencies
+brew install cmake
+brew install go
+
+# clone the repo
+git clone https://github.com/go-skynet/LocalAI.git
+
+cd LocalAI
+
+# build the binary
+make build
+
+# download gpt4all-j to models/
+wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j
+
+# use a matching prompt template from the examples
+cp -rf prompt-templates/ggml-gpt4all-j.tmpl models/
+
+# run LocalAI
+./local-ai --models-path ./models/ --debug
+
+# the API is now accessible at localhost:8080
+curl http://localhost:8080/v1/models
+
+curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
+     "model": "ggml-gpt4all-j",
+     "messages": [{"role": "user", "content": "How are you?"}],
+     "temperature": 0.9
+   }'
+```
+
+
 ## Frequently asked questions
 
 Here are answers to some of the most common questions.
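The final `curl` in the added Mac instructions hand-writes the JSON chat-completion body inline. As a rough convenience, the same body could be assembled by a small shell helper — a hypothetical sketch, not part of LocalAI, and it does no JSON escaping, so prompts must stay free of quotes and backslashes:

```shell
#!/bin/sh
# chat_body MODEL PROMPT -- print a chat-completion request body shaped
# like the one in the curl example above. Hypothetical helper: it performs
# no JSON escaping, so keep prompts free of quotes and backslashes.
chat_body() {
  model="$1"
  prompt="$2"
  printf '{"model": "%s", "messages": [{"role": "user", "content": "%s"}], "temperature": 0.9}' \
    "$model" "$prompt"
}

# Example: print the body for the same request as the curl example
chat_body "ggml-gpt4all-j" "How are you?"
```

With a server running, the output could be piped into `curl -d @- http://localhost:8080/v1/chat/completions`.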
diff --git a/examples/chatbot-ui/README.md b/examples/chatbot-ui/README.md
index 75fd073f..93459bcc 100644
--- a/examples/chatbot-ui/README.md
+++ b/examples/chatbot-ui/README.md
@@ -22,5 +22,25 @@ wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j
 
 docker-compose up -d --build
 ```
 
+## Pointing chatbot-ui to a separately managed LocalAI service
+
+If you want to use the [chatbot-ui example](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui) with an externally managed LocalAI service, you can alter the `docker-compose` file so that it looks like the one below. The file is smaller because the section that would normally start the LocalAI service has been removed. Take care to update the IP address (or FQDN) that the chatbot-ui service tries to access (marked `<>` below):
+```
+version: '3.6'
+
+services:
+  chatgpt:
+    image: ghcr.io/mckaywrigley/chatbot-ui:main
+    ports:
+      - 3000:3000
+    environment:
+      - 'OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXX'
+      - 'OPENAI_API_HOST=http://<>:8080'
+```
+
+Once you've edited the `docker-compose` file, start it with `docker compose up`, then browse to `http://localhost:3000`.
+
+## Accessing chatbot-ui
+
 Open http://localhost:3000 for the Web UI.
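The compose file in the diff above leaves the LocalAI host as a `<>` placeholder. One way to fill it in non-interactively is a `sed` substitution — a minimal sketch, where `LOCALAI_HOST` and the `192.168.1.10` address are illustrative stand-ins for your actual environment:

```shell
#!/bin/sh
# Replace the <> placeholder with a real LocalAI host.
# LOCALAI_HOST and the address below are illustrative values, not defaults.
LOCALAI_HOST="192.168.1.10"

# Demonstrate the substitution on the placeholder line itself
printf "      - 'OPENAI_API_HOST=http://<>:8080'\n" \
  | sed "s|http://<>:8080|http://${LOCALAI_HOST}:8080|"
```

In practice you would run the same `sed` expression against the `docker-compose` file itself (redirecting to a new file, or with `-i` where your `sed` supports it).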