LocalAI/backend

Latest commit: 06cd9ef98d by Ettore Di Giacinto, 2024-01-20 17:56:08 +01:00
feat(extra-backends): Improvements, adding mamba example (#1618)

* feat(extra-backends): Improvements
  - vllm: add max_tokens, wire up stream event
  - mamba: fixups, adding examples for mamba-chat
* examples(mamba-chat): add
* docs: update
Directory contents:

cpp            move BUILD_GRPC_FOR_BACKEND_LLAMA logic to makefile: errors in this section now immediately fail the build (#1576)    2024-01-13 10:08:26 +01:00
go             Revert "[Refactor]: Core/API Split" (#1550)                                                                          2024-01-05 18:04:46 +01:00
python         feat(extra-backends): Improvements, adding mamba example (#1618)                                                     2024-01-20 17:56:08 +01:00
backend.proto  feat: 🐍 add mamba support (#1589)                                                                                    2024-01-19 23:42:50 +01:00