LocalAI/backend
Ettore Di Giacinto 887b3dff04
feat: cuda transformers (#1401)
* Use CUDA in transformers if available

TensorFlow probably needs a different check (a sketch of the PyTorch path follows below).

Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>
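
A minimal sketch of the PyTorch-side check described above (the loader function is illustrative, not LocalAI's actual backend code):

    import torch
    from transformers import AutoModelForCausalLM

    def load_model(model_id: str):
        # Prefer CUDA when a GPU is visible to PyTorch; otherwise stay on CPU.
        device = "cuda" if torch.cuda.is_available() else "cpu"
        model = AutoModelForCausalLM.from_pretrained(model_id)
        # As noted above, TensorFlow models would need a different check,
        # e.g. tf.config.list_physical_devices("GPU").
        return model.to(device)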

* feat: expose CUDA at top level

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
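
If "top level" refers to the model configuration, the toggle might be consumed along these lines (the "cuda" option name and the config shape are assumptions drawn from the commit title, not LocalAI's verified schema):

    import torch

    def resolve_device(model_config: dict) -> str:
        # Hypothetical top-level "cuda" flag: request CUDA only when the
        # config asks for it and a GPU is actually available.
        want_cuda = bool(model_config.get("cuda", False))
        return "cuda" if want_cuda and torch.cuda.is_available() else "cpu"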

* tests: add tests and create a workflow for the Python extra backends

* doc: update note on how to use core images

---------

Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Erich Schubert <kno10@users.noreply.github.com>
2023-12-08 15:45:04 +01:00
Name           Last commit                                                   Date
cpp            refactor: move backends into the backends directory (#1279)  2023-11-13 22:40:16 +01:00
go             refactor: rename llama-stable to llama-ggml (#1287)          2023-11-18 08:18:43 +01:00
python         feat: cuda transformers (#1401)                               2023-12-08 15:45:04 +01:00
backend.proto  refactor: move backends into the backends directory (#1279)  2023-11-13 22:40:16 +01:00