LocalAI/backend/python/common-env/transformers

Latest commit: 939411300a by Ludovic Leroux, 2024-03-01 22:48:53 +01:00
Bump vLLM version + more options when loading models in vLLM (#1782)

* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix install exllama
install.sh               feat: more embedded models, coqui fixes, add model usage and description (#1556)  2024-01-08 00:37:02 +01:00
Makefile                 Build docker container for ROCm (#1595)                                           2024-02-16 15:08:50 +01:00
transformers-nvidia.yml  Bump vLLM version + more options when loading models in vLLM (#1782)              2024-03-01 22:48:53 +01:00
transformers-rocm.yml    Bump vLLM version + more options when loading models in vLLM (#1782)              2024-03-01 22:48:53 +01:00
transformers.yml         Bump vLLM version + more options when loading models in vLLM (#1782)              2024-03-01 22:48:53 +01:00