<h1 align="center">
  <br>
  <img height="300" src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"> <br>
  LocalAI
  <br>
</h1>

[![tests](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml) [![build container images](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)

[![Artifact Hub](https://img.shields.io/endpoint?url=https://artifacthub.io/badge/repository/localai)](https://artifacthub.io/packages/search?repo=localai)

[![](https://dcbadge.vercel.app/api/server/uJAeKSAGDy?style=flat-square&theme=default-inverted)](https://discord.gg/uJAeKSAGDy)

[Documentation website](https://localai.io/)
**LocalAI** is a drop-in replacement REST API compatible with the OpenAI API specifications for local inferencing. It lets you run LLMs (and more) locally or on-prem on consumer-grade hardware, supporting multiple model families compatible with the ggml format. No GPU is required.
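
Because the API mirrors OpenAI's, most existing OpenAI clients and SDKs can be pointed at a LocalAI instance just by overriding the API base URL. A minimal sketch (the variable names below are the ones honored by the pre-1.0 `openai` Python client; other clients expose an equivalent setting):

```bash
# Assumes a LocalAI instance listening on localhost:8080 (the default in the docker-compose setup below)
export OPENAI_API_BASE=http://localhost:8080/v1
# LocalAI does not require a real key, but many clients insist on one being set
export OPENAI_API_KEY=sk-not-needed
```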

In a nutshell:

- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access is required either.
- Optional GPU acceleration is available for `llama.cpp`-compatible LLMs. See also the [build section](https://localai.io/basics/build/index.html).
- Supports multiple models:
    - 📖 [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `gpt4all.cpp`, ... [:book: and more](https://localai.io/model-compatibility/index.html#model-compatibility-table))
    - 🗣 [Text to Audio](https://localai.io/features/text-to-audio/)
    - 🔈 [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
    - 🎨 [Image generation with stable diffusion](https://localai.io/features/image-generation)
    - 🔥 [OpenAI functions](https://localai.io/features/openai-functions/) 🆕 (see the request sketch after this list)
- 🏃 Once loaded the first time, it keeps models in memory for faster inference.
- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
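
As a sketch of the OpenAI-style function calling mentioned above, a request can look like the following. It follows the OpenAI chat-completions request format; the model name and function definition are placeholders, so adapt them to the model you have installed (see the [OpenAI functions](https://localai.io/features/openai-functions/) docs for the authoritative syntax):

```bash
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
  "model": "your-model.bin",
  "messages": [{"role": "user", "content": "What is the weather like in Boston?"}],
  "functions": [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {"type": "string", "description": "The city, e.g. Boston, MA"}
      },
      "required": ["location"]
    }
  }],
  "function_call": "auto"
}'
```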

LocalAI was created by [Ettore Di Giacinto](https://github.com/mudler/) and is a community-driven project, focused on making AI accessible to anyone. Any contribution, feedback, and PR is welcome!

Note that this started just as a [fun weekend project](https://localai.io/#backstory) to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!

See the [Getting started](https://localai.io/basics/getting_started/index.html) and [examples](https://github.com/go-skynet/LocalAI/tree/master/examples/) sections to learn how to use LocalAI. For a list of curated models check out the [model gallery](https://localai.io/models/).

| [ChatGPT OSS alternative](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui) | [Image generation](https://localai.io/api-endpoints/index.html#image-generation) |
|---|---|
| ![Screenshot from 2023-04-26 23-59-55](https://user-images.githubusercontent.com/2420543/234715439-98d12e03-d3ce-4f94-ab54-2b256808e05e.png) | ![b6441997879](https://github.com/go-skynet/LocalAI/assets/2420543/d50af51c-51b7-4f39-b6c2-bf04c403894c) |

| [Telegram bot](https://github.com/go-skynet/LocalAI/tree/master/examples/telegram-bot) | [Flowise](https://github.com/go-skynet/LocalAI/tree/master/examples/flowise) |
|---|---|
| ![Screenshot from 2023-06-09 00-36-26](https://github.com/go-skynet/LocalAI/assets/2420543/e98b4305-fa2d-41cf-9d2f-1bb2d75ca902) | ![Screenshot from 2023-05-30 18-01-03](https://github.com/go-skynet/LocalAI/assets/2420543/02458782-0549-4131-971c-95ee56ec1af8) |

## Hot topics / Roadmap

- [x] Support for embeddings
- [x] Support for audio transcription with https://github.com/ggerganov/whisper.cpp
- [x] Support for text-to-audio
- [x] GPU/CUDA support (https://github.com/go-skynet/LocalAI/issues/69)
- [x] Enable automatic downloading of models from a curated gallery
- [x] Enable automatic downloading of models from HuggingFace
- [ ] Upstream our Golang bindings to llama.cpp (https://github.com/ggerganov/llama.cpp/issues/351)
- [ ] Enable gallery management directly from the webui.
- [x] 🔥 OpenAI functions: https://github.com/go-skynet/LocalAI/issues/588
- [ ] 🔥 GPTQ support: https://github.com/go-skynet/LocalAI/issues/796

## News

Check the news and the release notes in the [dedicated section](https://localai.io/basics/news/index.html).

- 🔥🔥🔥 23-07-2023: **v1.22.0**: LLaMa2, HuggingFace embeddings, and more! [Changelog](https://github.com/go-skynet/LocalAI/releases/tag/v1.22.0)

For the latest news, also follow [@LocalAI_API](https://twitter.com/LocalAI_API) and [@mudler_it](https://twitter.com/mudler_it) on Twitter.

## Media, Blogs, Social

- [Create a Slackbot for teams and OSS projects that answers documentation questions](https://mudler.pm/posts/smart-slackbot-for-teams/)
- [LocalAI meets k8sgpt](https://www.youtube.com/watch?v=PKrDNuJ_dfE)
- [Question Answering on Documents locally with LangChain, LocalAI, Chroma, and GPT4All](https://mudler.pm/posts/localai-question-answering/)
- [Tutorial to use k8sgpt with LocalAI](https://medium.com/@tyler_97636/k8sgpt-localai-unlock-kubernetes-superpowers-for-free-584790de9b65)

## Contribute and help

To help the project you can:

- [Hacker News post](https://news.ycombinator.com/item?id=35726934) - help us out by upvoting if you like this project.
- If you have technical skills and want to contribute to development, have a look at the open issues. If you are new, you can have a look at the [good-first-issue](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) and [help-wanted](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) labels.
- If you don't have technical skills you can still help by improving documentation, adding examples, or sharing your user stories with our community; any help and contribution is welcome!

## Usage

Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section. Below you will find generic, quick instructions to get up and running with LocalAI.

The easiest way to run LocalAI is with `docker-compose` (to build locally, see [building LocalAI](https://localai.io/basics/build/index.html)):

```bash
git clone https://github.com/go-skynet/LocalAI
cd LocalAI

# (optional) Checkout a specific LocalAI tag
# git checkout -b build <TAG>

# copy your models to models/
cp your-model.bin models/

# (optional) Edit the .env file to set things like context size and threads
# vim .env

# start with docker-compose
docker-compose up -d --pull always
# or you can build the images with:
# docker-compose up -d --build

# Now API is accessible at localhost:8080
curl http://localhost:8080/v1/models
# {"object":"list","data":[{"id":"your-model.bin","object":"model"}]}

curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
     "model": "your-model.bin",
     "prompt": "A long time ago in a galaxy far, far away",
     "temperature": 0.7
   }'
```
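
The `.env` file mentioned in the example above controls basic runtime settings. A sketch of typical contents (variable names follow the `.env` example shipped in the repository; the exact set and defaults may differ between releases):

```bash
# Number of CPU threads used for inference (illustrative value)
THREADS=4
# Default context size for loaded models
CONTEXT_SIZE=512
# Where models are read from inside the container
MODELS_PATH=/models
# Enable verbose logging
# DEBUG=true
```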

### Example: Use GPT4ALL-J model

<details>

```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI
cd LocalAI

# (optional) Checkout a specific LocalAI tag
# git checkout -b build <TAG>

# Download gpt4all-j to models/
wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j
# Use a template from the examples
cp -rf prompt-templates/ggml-gpt4all-j.tmpl models/
# (optional) Edit the .env file to set things like context size and threads
# vim .env
# start with docker-compose
docker-compose up -d --pull always
# or you can build the images with:
# docker-compose up -d --build

# Now API is accessible at localhost:8080
curl http://localhost:8080/v1/models
# {"object":"list","data":[{"id":"ggml-gpt4all-j","object":"model"}]}
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
     "model": "ggml-gpt4all-j",
     "messages": [{"role": "user", "content": "How are you?"}],
     "temperature": 0.9
   }'
# {"model":"ggml-gpt4all-j","choices":[{"message":{"role":"assistant","content":"I'm doing well, thanks. How about you?"}}]}
```
</details>
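
The prompt template copied in the example above wraps your message into the format the model was trained on. Templates are Go text templates where `{{.Input}}` is replaced by the user input; the gpt4all-j one looks roughly like the sketch below (check `prompt-templates/` in the repository for the exact content):

```
The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
### Prompt:
{{.Input}}
### Response:
```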

### Build locally

<details>

To build the `LocalAI` container image locally you can use `docker`:
```bash
# build the image
docker build -t localai .
# run it, exposing the API on port 8080
docker run -p 8080:8080 localai
```
Or you can build the binary with `make`:
```bash
make build
```
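
After building, a sketch of running the binary directly (flag names are from the project's CLI help and may change between releases; run `./local-ai --help` for the current list):

```bash
# Serve models from ./models on port 8080
./local-ai --models-path ./models --address ":8080"
```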
</details>

See the [build section](https://localai.io/basics/build/index.html) in our documentation for detailed instructions.

### Run LocalAI in Kubernetes

LocalAI can be installed inside Kubernetes with Helm. See the [installation instructions](https://localai.io/basics/getting_started/index.html#run-localai-in-kubernetes).
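
A minimal sketch of a Helm-based install, assuming the community `go-skynet` Helm chart repository (check the installation instructions linked above for the authoritative chart name, repository URL, and values):

```bash
# Add the LocalAI Helm chart repository (assumed URL, see the docs linked above)
helm repo add go-skynet https://go-skynet.github.io/helm-charts/
helm repo update

# Install the chart into its own namespace; tune values.yaml for models, resources, etc.
helm install local-ai go-skynet/local-ai --namespace local-ai --create-namespace
```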

## Supported API endpoints

See the [list of the LocalAI features](https://localai.io/features/index.html) for a full tour of the available API endpoints.
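
As a taste, the endpoints follow the OpenAI API shapes. For example, an embeddings request might look like the sketch below (the model name is a placeholder for an embeddings-capable model you have installed):

```bash
curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
  "model": "your-embeddings-model",
  "input": "A long time ago in a galaxy far, far away"
}'
```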

## Frequently asked questions

See [the FAQ](https://localai.io/faq/index.html) section for a list of common questions.

## Projects already using LocalAI to run local models

Feel free to open up a PR to get your project listed!

- [Kairos](https://github.com/kairos-io/kairos)
- [k8sgpt](https://github.com/k8sgpt-ai/k8sgpt#running-local-models)
- [Spark](https://github.com/cedriking/spark)
- [autogpt4all](https://github.com/aorumbayev/autogpt4all)
- [Mods](https://github.com/charmbracelet/mods)
- [Flowise](https://github.com/FlowiseAI/Flowise)
- [BMO Chatbot](https://github.com/longy2k/obsidian-bmo-chatbot)
- [Mattermost OpenOps](https://openops.mattermost.com)

## Sponsors

> Do you find LocalAI useful?

Support the project by becoming [a backer or sponsor](https://github.com/sponsors/mudler). Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors who support this project:

| ![Spectro Cloud logo_600x600px_transparent bg](https://github.com/go-skynet/LocalAI/assets/2420543/68a6f3cb-8a65-4a4d-99b5-6417a8905512) |
|:---:|
| [Spectro Cloud](https://www.spectrocloud.com/) |
| Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs! |

## Star history

[![LocalAI Star history Chart](https://api.star-history.com/svg?repos=go-skynet/LocalAI&type=Date)](https://star-history.com/#go-skynet/LocalAI&Date)

## License

LocalAI is a community-driven project created by [Ettore Di Giacinto](https://github.com/mudler/).

MIT

## Author

Ettore Di Giacinto and others

## Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp

## Contributors

<a href="https://github.com/go-skynet/LocalAI/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=go-skynet/LocalAI" />
</a>