🤖 The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs ggml, gguf, GPTQ, onnx, TF compatible models: llama, llama2, rwkv, whisper, vicuna, koala, cerebras, falcon, dolly, starcoder, and many others



LocalAI


💡 Get help - FAQ 💭 Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 📣 News 🛫 Examples 🖼️ Models 🚀 Roadmap


Follow LocalAI_API · Join the LocalAI Discord Community

LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs and generate images, audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. No GPU is required.
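Because the API follows the OpenAI specifications, any tool that already speaks the OpenAI API can be pointed at a running LocalAI instance simply by changing the base URL. As a minimal sketch (assuming an instance is already listening on localhost:8080, as in the quickstart below), listing the available models looks like this:

# List the models exposed by a running LocalAI instance
# (assumes the default port 8080 used in the quickstart below)
curl http://localhost:8080/v1/models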

🔥🔥 Hot topics / Roadmap

Roadmap

Hot topics (looking for contributors):

If you want to help and contribute, issues up for grabs are listed here: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22

💻 Getting started

For a detailed step-by-step introduction, refer to the Getting Started guide. For those in a hurry, here's a straightforward one-liner to launch a LocalAI instance with phi-2 using docker:

docker run -ti -p 8080:8080 localai/localai:v2.9.0-ffmpeg-core phi-2
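Once the container is up, the instance answers OpenAI-style requests on port 8080. A minimal sketch of a chat completion request against it, assuming the phi-2 model name from the command above:

# Send an OpenAI-compatible chat completion request to the local instance
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "phi-2",
    "messages": [{"role": "user", "content": "How are you doing?"}],
    "temperature": 0.7
  }'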

🚀 Features

💻 Usage

Check out the Getting started section in our documentation.

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries:

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you use this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors who support this project:

Spectro Cloud
Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs!

And a huge shout-out to individuals sponsoring the project by donating hardware or backing the project.

🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗