


LocalAI


💡 Get help - FAQ 💭 Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 📣 News 🛫 Examples 🖼️ Models 🚀 Roadmap


LocalAI is the free, Open Source OpenAI alternative. It acts as a drop-in replacement REST API that is compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs and generate images and audio (and more) locally or on-prem on consumer-grade hardware, supporting multiple model families. It does not require a GPU.

Follow LocalAI

Follow LocalAI_API Join LocalAI Discord Community

Connect with the Creator

Follow mudler_it Follow on Github

Share LocalAI Repository

Follow _LocalAI Share on Telegram Share on Reddit Buy Me A Coffee

💻 Getting started

🔥🔥 Hot topics / Roadmap

Roadmap

🆕 New! LLM finetuning guide

Hot topics (looking for contributors):

If you want to help and contribute, check out the issues marked as up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22


In a nutshell:

  • Local, OpenAI drop-in alternative REST API. You own your data.
  • NO GPU required. NO Internet access is required either.
    • Optional GPU acceleration is available for llama.cpp-compatible LLMs. See also the build section.
  • Supports multiple models
  • 🏃 Once loaded the first time, it keeps models in memory for faster inference
  • Doesn't shell out, but uses C++ bindings for faster inference and better performance.

LocalAI was created by Ettore Di Giacinto and is a community-driven project focused on making AI accessible to anyone. Contributions, feedback, and PRs are welcome!

Note that this started as a fun weekend project, an attempt to build the necessary pieces for a full AI assistant like ChatGPT. The community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!

🚀 Features

💻 Usage

Check out the Getting started section in our documentation.
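Because the API follows the OpenAI specifications, existing OpenAI clients only need their base URL changed to talk to a local instance. The following minimal sketch builds an OpenAI-style chat completion request with Python's standard library; the port 8080 and model name are illustrative and depend on how you configured your LocalAI instance:

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a LocalAI server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("http://localhost:8080", "gpt-3.5-turbo", "Hello!")
# urllib.request.urlopen(req) would send the request to a running instance;
# the JSON response mirrors the OpenAI chat completion response shape.
```

The same effect can be achieved with any OpenAI client library by pointing its base URL at the LocalAI server.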

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you use this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors who support this project:

Spectro Cloud logo_600x600px_transparent bg
Spectro Cloud
Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs!

And a huge shout-out to individuals sponsoring the project by donating hardware or backing the project.

🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗