LocalAI/backend/python/common-env/transformers
Koen Farell 36da11a0ee
deps: Update vLLM version to add support for the Cohere Command_R model in vLLM inference (#1975)
* Update vLLM version to add support for Command_R

Signed-off-by: Koen Farell <hellios.dt@gmail.com>

* fix: Fixed vLLM version in requirements

Signed-off-by: Koen Farell <hellios.dt@gmail.com>

* chore: Update transformers-rocm.yml

Signed-off-by: Koen Farell <hellios.dt@gmail.com>

* chore: Update transformers.yml version of vllm

Signed-off-by: Koen Farell <hellios.dt@gmail.com>

---------

Signed-off-by: Koen Farell <hellios.dt@gmail.com>
2024-04-10 11:25:26 +00:00
install.sh feat: Token Stream support for Transformer, fix: missing package for OpenVINO (#1908) 2024-03-27 17:50:35 +01:00
Makefile feat(intel): add diffusers/transformers support (#1746) 2024-03-07 14:37:45 +01:00
transformers-nvidia.yml deps: Update vLLM version to add support for the Cohere Command_R model in vLLM inference (#1975) 2024-04-10 11:25:26 +00:00
transformers-rocm.yml deps: Update vLLM version to add support for the Cohere Command_R model in vLLM inference (#1975) 2024-04-10 11:25:26 +00:00
transformers.yml deps: Update vLLM version to add support for the Cohere Command_R model in vLLM inference (#1975) 2024-04-10 11:25:26 +00:00
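The transformers*.yml files above are conda-style environment specs in which the vLLM dependency was bumped. A minimal sketch of how such a spec pins vLLM through pip (the channel list, package set, and version number here are illustrative placeholders, not the values actually committed):

```yaml
# Hypothetical excerpt of an environment spec such as transformers.yml.
# The real file's channels, packages, and vLLM version may differ.
name: transformers
channels:
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
      # An exact pin keeps inference behavior reproducible across rebuilds;
      # the version below is a placeholder, not the one pinned in this commit.
      - vllm==0.4.0
```

Keeping the pip pin identical across transformers.yml, transformers-nvidia.yml, and transformers-rocm.yml is what the follow-up sub-commits in this change accomplish, so the CPU, CUDA, and ROCm environments resolve the same vLLM release.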