🤖 The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs ggml, GGUF, GPTQ, ONNX and TF-compatible models: LLaMA, LLaMA 2, RWKV, Whisper, Vicuna, Koala, Cerebras, Falcon, Dolly, StarCoder, and many others.




LocalAI


💡 Get help - FAQ 💭 Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 📣 News 🛫 Examples 🖼️ Models 🚀 Roadmap


Follow LocalAI_API · Join the LocalAI Discord Community

LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs and generate images and audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU.
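To illustrate what "drop-in replacement" means in practice, the sketch below builds an OpenAI-style chat-completion request; against LocalAI, only the base URL changes, since it exposes the same `/v1/chat/completions` route. The host, port, and model name here are assumptions for illustration (LocalAI serves whatever models you have installed), not guarantees made by this README.

```python
import json

# Assumed local LocalAI instance; the path layout mirrors the OpenAI API.
base_url = "http://localhost:8080/v1"

# Standard OpenAI chat-completions payload; "mixtral-instruct" stands in
# for any model name installed in your LocalAI instance.
payload = {
    "model": "mixtral-instruct",
    "messages": [
        {"role": "user", "content": "Hello, LocalAI!"},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload)
print(base_url + "/chat/completions")
print(body)
```

Because the schema matches, existing OpenAI client libraries can usually be pointed at a LocalAI instance simply by overriding their base URL.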

🔥🔥 Hot topics / Roadmap

Roadmap

Hot topics (looking for contributors):

If you want to help and contribute, see the issues marked "up for grabs": https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22

💻 Getting started

🚀 Features

💻 Usage

Check out the Getting started section in our documentation.
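As a minimal sketch of a container-based setup (the image name, port, and volume path below are assumptions modeled on the repository's docker-compose.yaml; consult the Getting started documentation for the current values):

```yaml
# Illustrative compose fragment: run the LocalAI API on port 8080
# and mount a local ./models directory into the container.
services:
  api:
    image: quay.io/go-skynet/local-ai:latest  # assumed image tag
    ports:
      - "8080:8080"
    volumes:
      - ./models:/models  # assumed models path inside the container
```

Once the container is up, the OpenAI-compatible API is reachable at http://localhost:8080/v1.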

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you use this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors who support this project:

Spectro Cloud
Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs!

And a huge shout-out to individuals sponsoring the project by donating hardware or backing the project.

🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗