🤖 The free, Open Source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware. No GPU required. Runs ggml, gguf, GPTQ, ONNX, and TF-compatible models: llama, llama2, rwkv, whisper, vicuna, koala, cerebras, falcon, dolly, starcoder, and many others.



LocalAI


💡 Get help - FAQ 💭Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 📣 News 🛫 Examples 🖼️ Models 🚀 Roadmap


Follow @LocalAI_API · Join the LocalAI Discord Community

LocalAI is the free, Open Source OpenAI alternative. It acts as a drop-in replacement REST API that is compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images, audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU.

🔥🔥 Hot topics / Roadmap

Roadmap

Hot topics (looking for contributors):

If you want to help and contribute, the issues labeled "up for grabs" are a good place to start: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22

💻 Getting started

For a detailed step-by-step introduction, refer to the Getting Started guide. For those in a hurry, here's a straightforward one-liner to launch a LocalAI instance with phi-2 using docker:

docker run -ti -p 8080:8080 localai/localai:v2.9.0-ffmpeg-core phi-2
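Once the container is up, the instance exposes an OpenAI-compatible REST API on port 8080, so you can query it the same way you would query OpenAI's endpoints. A minimal sketch (assuming the quickstart container above is running locally and serving the phi-2 model):

```shell
# Ask the local phi-2 model a question via the OpenAI-compatible
# chat completions endpoint exposed by the container above.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "phi-2", "messages": [{"role": "user", "content": "How are you?"}]}'
```

Any existing OpenAI client library can talk to the same instance by pointing its base URL at http://localhost:8080/v1.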

🚀 Features

💻 Usage

Check out the Getting started section in our documentation.

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you use this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors who support this project:

Spectro Cloud
Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs!

And a huge shout-out to individuals sponsoring the project by donating hardware or backing the project.

🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗