TwinFin
504f2e8bf4
Update Backend Dependencies ( #1797 )
...
* Update transformers.yml
Signed-off-by: TwinFin <57421631+TwinFinz@users.noreply.github.com>
* Update transformers-rocm.yml
Signed-off-by: TwinFin <57421631+TwinFinz@users.noreply.github.com>
* Update transformers-nvidia.yml
Signed-off-by: TwinFin <57421631+TwinFinz@users.noreply.github.com>
---------
Signed-off-by: TwinFin <57421631+TwinFinz@users.noreply.github.com>
2024-03-05 10:10:00 +00:00
Ludovic Leroux
939411300a
Bump vLLM version + more options when loading models in vLLM ( #1782 )
...
* Bump vLLM version to 0.3.2
* Add vLLM model loading options
* Remove transformers-exllama
* Fix install exllama
2024-03-01 22:48:53 +01:00
Chakib Benziane
594eb468df
Add TTS dependency for cuda based builds fixes #1727 ( #1730 )
...
Signed-off-by: Chakib Benziane <contact@blob42.xyz>
2024-02-20 21:59:43 +01:00
Ettore Di Giacinto
62a02cd1fe
deps(conda): use transformers environment with autogptq ( #1555 )
2024-01-06 15:30:53 +01:00
Ettore Di Giacinto
949da7792d
deps(conda): use transformers-env with vllm,exllama(2) ( #1554 )
...
* deps(conda): use transformers with vllm
* join vllm, exllama, exllama2, split petals
2024-01-06 13:32:28 +01:00
Ettore Di Giacinto
95eb72bfd3
feat: add 🐸 coqui ( #1489 )
...
* feat: add coqui
* docs: update news
2023-12-24 19:38:54 +01:00
Ettore Di Giacinto
939187a129
env(conda): use transformers for vall-e-x ( #1481 )
2023-12-23 14:31:34 -05:00
Ettore Di Giacinto
b4b21a446b
feat(conda): share envs with transformer-based backends ( #1465 )
...
* feat(conda): share env between diffusers and bark
* Detect if env already exists
* share diffusers and petals
* tests: add petals
* Use smaller model for tests with petals
* test only model load on petals
* tests(petals): run only load model tests
* Revert "test only model load on petals"
This reverts commit 111cfa97f1.
* move transformers and sentencetransformers to common env
* Share also transformers-musicgen
2023-12-21 08:35:15 +01:00