LocalAI/backend/python/transformers

# Creating a separate environment for the transformers project

```
make transformers
```
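The `make` target creates a conda environment shared with the other transformer-based backends. A minimal sketch of starting the gRPC backend server manually afterwards, assuming the environment is named `transformers` and that `transformers_server.py` accepts an `--addr` flag (both are assumptions; LocalAI normally launches the backend itself via `run.sh`):

```shell
# Activate the shared conda environment (environment name is an assumption)
conda activate transformers

# Start the gRPC backend server; the --addr flag and the port shown here
# are assumptions based on how the other LocalAI python backends are launched
python transformers_server.py --addr localhost:50051
```

In normal operation none of this is run by hand: LocalAI invokes `run.sh`, which activates the environment and starts the server on the address it passes in.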