# LocalAI/backend/python/diffusers
| File | Last commit | Date |
|------|-------------|------|
| backend_diffusers.py | fix: guidance_scale not work in sd (#1488) | 2023-12-24 19:24:52 +01:00 |
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| backend_pb2.py | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 22:48:53 +01:00 |
| diffusers-rocm.yml | Build docker container for ROCm (#1595) | 2024-02-16 15:08:50 +01:00 |
| diffusers.yml | Build docker container for ROCm (#1595) | 2024-02-16 15:08:50 +01:00 |
| install.sh | fix(python): pin exllama2 (#1711) | 2024-02-14 21:44:12 +01:00 |
| Makefile | Build docker container for ROCm (#1595) | 2024-02-16 15:08:50 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| test.py | feat(diffusers): update, add autopipeline, controlnet (#1432) | 2023-12-13 19:20:22 +01:00 |
| test.sh | tests: add diffusers tests (#1419) | 2023-12-11 08:20:34 +01:00 |

To create a separate Python environment for the diffusers backend, run:

```sh
make diffusers
```
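
Once the environment exists, the backend is started through `run.sh` and exercised by `test.sh`. The sketch below shows that flow under stated assumptions: `diffusers.yml`/`diffusers-rocm.yml` are taken to be the environment definitions (standard and ROCm variants), and the `--addr` flag is an assumption borrowed from how LocalAI's Python gRPC backends are typically launched; check `run.sh` and `backend_diffusers.py` for the authoritative invocation.

```sh
# Create the dedicated Python environment for this backend
# (diffusers.yml; diffusers-rocm.yml is the ROCm variant).
make diffusers

# Start the gRPC server. The --addr flag and port are assumptions;
# see run.sh and backend_diffusers.py for the exact arguments.
./run.sh --addr localhost:50051

# In another shell, run the backend tests (added in #1419).
./test.sh
```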