
Advanced configuration

This section contains examples of how to install models manually using config files.

Prerequisites

First, clone LocalAI:

git clone https://github.com/go-skynet/LocalAI

cd LocalAI

Set up the model you prefer from the examples below, then start LocalAI:

docker compose up -d --pull always

If LocalAI is already running, you can restart it with:

docker compose restart

See also the getting started guide: https://localai.io/basics/getting_started/
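Once the container is up, you can check that the API is reachable. LocalAI exposes an OpenAI-compatible API, by default on port 8080 (the port may differ if you changed the compose file):

```shell
# List the models LocalAI currently knows about; before any model files
# have been installed, the returned JSON list may be empty.
curl http://localhost:8080/v1/models
```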

Mistral

To set up Mistral, copy the files from the mistral directory into the models folder:

cp -r examples/configurations/mistral/* models/
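Among the copied files is a model config YAML. As a rough sketch of what such a config contains (the field values below are illustrative; the file shipped in examples/configurations/mistral is authoritative):

```yaml
# Hypothetical sketch of a LocalAI model config, not the shipped file.
name: mistral                               # name clients use in API requests
parameters:
  model: mistral-7b-openorca.Q6_K.gguf      # GGUF file in the models folder
context_size: 4096
```

The name field is what you pass as the model parameter in API requests, and parameters.model must match the filename of the weights you download in the next step.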

Now download the model:

wget https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF/resolve/main/mistral-7b-openorca.Q6_K.gguf -O models/mistral-7b-openorca.Q6_K.gguf
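With the weights in place and LocalAI (re)started, you can send a request through the OpenAI-compatible chat endpoint. The model name "mistral" below is an assumption; it must match the name field declared in the config you copied:

```shell
# Send a chat completion request to the local server; "mistral" is assumed
# to be the model name registered by the copied config file.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "How are you doing?"}]
  }'
```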