LocalAI/backend/python/transformers
Commit 887b3dff04 by Ettore Di Giacinto
feat: cuda transformers (#1401)
* Use cuda in transformers if available

tensorflow probably needs a different check.

Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>

* feat: expose CUDA at top level

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* tests: add to tests and create workflow for py extra backends

* doc: update note on how to use core images

---------

Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Erich Schubert <kno10@users.noreply.github.com>
2023-12-08 15:45:04 +01:00
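The first bullet of the commit ("Use cuda in transformers if available") describes a runtime device check. A minimal sketch of that idea is below; the function name `pick_device` is illustrative and not LocalAI's actual code, which lives in transformers_server.py.

```python
# Hedged sketch of a "use CUDA if available" check, as the commit describes.
# pick_device is a hypothetical helper name, not LocalAI's real API.

def pick_device() -> str:
    """Return "cuda" when PyTorch reports a usable GPU, else fall back to "cpu"."""
    try:
        import torch  # imported lazily so the check degrades gracefully without torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"

# A transformers backend would then move its loaded model onto the
# selected device, e.g. model.to(pick_device()).
```

As the commit message notes, this check is PyTorch-specific; a TensorFlow model would need a different mechanism (e.g. querying TensorFlow's own device list).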
backend_pb2_grpc.py
backend_pb2.py
Makefile
README.md
run.sh
test_transformers_server.py
test.sh
transformers_server.py
transformers.yml

To create a separate Python environment for the transformers backend, run:

make transformers
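The `make transformers` target above creates an isolated environment for this backend. How it does so is not shown here; as a rough illustration only, an isolated per-backend environment can be created with the stdlib `venv` module (the actual Makefile may use conda or another tool instead):

```python
# Hypothetical sketch of per-backend environment creation using the
# stdlib venv module. This is NOT what LocalAI's Makefile necessarily does;
# create_backend_env is an illustrative name.
import venv


def create_backend_env(path: str) -> str:
    """Create an isolated virtual environment at `path` and return the path.

    with_pip=False keeps the sketch runnable on minimal Python installs
    that lack ensurepip; a real setup would enable pip and install
    the backend's requirements into the environment.
    """
    venv.EnvBuilder(with_pip=False).create(path)
    return path
```

The backend's run.sh and test.sh would then activate such an environment before starting or testing the gRPC server.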