LocalAI/backend
Latest commit 939411300a by Ludovic Leroux, 2024-03-01 22:48:53 +01:00

Bump vLLM version + more options when loading models in vLLM (#1782)

* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix install exllama
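PR #1782 bumps the Python backend to vLLM 0.3.2 and exposes more of vLLM's model loading options. As a rough illustration of what those options control, here is a minimal sketch against the vLLM 0.3.2 offline API directly; the model name and option values are placeholders, and the exact set of options the backend forwards is defined in backend.proto, not here.

```python
# Minimal sketch of vLLM 0.3.x model loading options (placeholder values).
from vllm import LLM, SamplingParams

llm = LLM(
    model="facebook/opt-125m",     # placeholder model id
    dtype="auto",                  # let vLLM pick the weight dtype
    trust_remote_code=False,       # required by some custom HF model code
    tensor_parallel_size=1,        # number of GPUs to shard across
    gpu_memory_utilization=0.90,   # fraction of GPU memory vLLM may claim
    max_model_len=2048,            # cap the context window
    swap_space=4,                  # CPU swap space in GiB
)

outputs = llm.generate(
    ["What is LocalAI?"],
    SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64),
)
print(outputs[0].outputs[0].text)
```

In LocalAI itself these knobs would be set per model in its configuration and carried over gRPC to the Python backend, rather than written in application code as above.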
Name                Last commit                                                             Date
cpp                 deps(llama.cpp): update (#1759)                                         2024-02-26 13:18:44 +01:00
go                  Fix Command Injection Vulnerability (#1778)                             2024-02-29 18:32:29 +00:00
python              Bump vLLM version + more options when loading models in vLLM (#1782)   2024-03-01 22:48:53 +01:00
backend_grpc.pb.go  transformers: correctly load automodels (#1643)                         2024-01-26 00:13:21 +01:00
backend.proto       Bump vLLM version + more options when loading models in vLLM (#1782)   2024-03-01 22:48:53 +01:00
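The go entry above records a command-injection fix (#1778). The patched code lives in the Go tree; as a language-neutral sketch of the underlying pattern (the command and input below are hypothetical, not the code touched by the PR), the point is to pass untrusted input as a discrete argument-vector element instead of interpolating it into a shell string.

```python
# Sketch of the safe-subprocess pattern behind command-injection fixes.
# The command and input are hypothetical, not taken from #1778.
import subprocess

user_input = 'model"; rm -rf / #'  # hostile input

# Unsafe: the string is handed to a shell, which parses the metacharacters.
# subprocess.run(f"convert {user_input}", shell=True)

# Safe: no shell is involved; user_input stays a single argv element.
result = subprocess.run(
    ["echo", user_input],          # list form bypasses shell parsing
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout, end="")
```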