
# Creating a separate environment for the ttsbark project

    make ttsbark

# Testing the gRPC server

    <The path of your python interpreter> -m unittest test_ttsbark.py

For example:

    /opt/conda/envs/bark/bin/python -m unittest extra/grpc/bark/test_ttsbark.py
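The `-m unittest` invocation above uses Python's standard test runner. As a rough illustration of the shape such a test module takes, here is a minimal, hypothetical skeleton (the names and checks are placeholders; the real suite lives in `test.py` and starts the bark gRPC server before asserting against it):

```python
import unittest


class TestTTSBark(unittest.TestCase):
    """Hypothetical skeleton mirroring the structure of a backend test suite."""

    def setUp(self):
        # The real suite would launch the bark gRPC server here
        # and wait for it to accept connections.
        self.server_ready = True

    def test_server_health(self):
        # The real suite would exercise the server over gRPC;
        # this placeholder only shows where such an assertion goes.
        self.assertTrue(self.server_ready)


if __name__ == "__main__":
    unittest.main()
```

Because `unittest` discovers any `test_*` methods on `TestCase` subclasses, the same `python -m unittest <module>` command shown above runs this file unchanged.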