939411300a | Ludovic Leroux | 2024-03-01 22:48:53 +01:00
Bump vLLM version + more options when loading models in vLLM (#1782)
* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix exllama install
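An illustrative sketch (not LocalAI's actual code) of what "model loading options" for vLLM amounts to: collecting per-model settings into keyword arguments for vLLM's `LLM()` constructor. The parameter names used here (`dtype`, `quantization`, `trust_remote_code`, `gpu_memory_utilization`, `tensor_parallel_size`) exist on vLLM 0.3.x; the helper itself and the model names are assumptions for illustration.

```python
def vllm_load_kwargs(model,
                     dtype="auto",
                     quantization=None,        # e.g. "awq" or "gptq"
                     trust_remote_code=False,
                     gpu_memory_utilization=0.9,
                     tensor_parallel_size=1):
    """Build the kwargs dict to pass to vllm.LLM(**kwargs)."""
    kwargs = {
        "model": model,
        "dtype": dtype,
        "trust_remote_code": trust_remote_code,
        "gpu_memory_utilization": gpu_memory_utilization,
        "tensor_parallel_size": tensor_parallel_size,
    }
    # Only pass quantization when set; vLLM autodetects otherwise.
    if quantization is not None:
        kwargs["quantization"] = quantization
    return kwargs

# Actually loading requires vLLM and a GPU, so it is left commented out:
# from vllm import LLM
# llm = LLM(**vllm_load_kwargs("facebook/opt-125m", quantization="awq"))
```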
cb7512734d | Ettore Di Giacinto | 2024-01-26 00:13:21 +01:00
transformers: correctly load automodels (#1643)
* backends(transformers): use AutoModel with LLM types
* examples: animagine-xl
* Add codellama examples
7641f92cde | Ettore Di Giacinto | 2023-12-13 19:20:22 +01:00
feat(diffusers): update, add autopipeline, controlnet (#1432)
* feat(diffusers): update, add autopipeline, controlnet
* tests with AutoPipeline
* simplify logic
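An illustrative sketch (not LocalAI's actual code) of what AutoPipeline support means: dispatching each task to one "auto" entry point instead of hard-coding a pipeline class. The class names below are real diffusers classes; the dispatch table and helper are assumptions for illustration.

```python
# Map a generation task to the diffusers auto-pipeline class name that
# resolves the concrete pipeline from the model's own config.
AUTO_PIPELINES = {
    "text2image": "AutoPipelineForText2Image",
    "image2image": "AutoPipelineForImage2Image",
    "inpainting": "AutoPipelineForInpainting",
}

def pipeline_class_name(task):
    """Return the diffusers auto-pipeline class name for a task."""
    try:
        return AUTO_PIPELINES[task]
    except KeyError:
        raise ValueError(f"unsupported task: {task}") from None

# With diffusers installed, loading becomes (commented out: downloads weights):
# from diffusers import AutoPipelineForText2Image
# pipe = AutoPipelineForText2Image.from_pretrained("runwayml/stable-diffusion-v1-5")
```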
2b2d6673ff | Ettore Di Giacinto | 2023-12-05 08:15:37 +01:00
exllama(v2): fix exllamav1, add exllamav2 (#1384)
* fix(exllama): fix exllama deps with anaconda
* feat(exllamav2): add exllamav2 backend
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>