LocalAI/backend/python/bark
Files in this backend:

- Makefile
- README.md
- backend_pb2.py
- backend_pb2_grpc.py
- run.sh
- test.py
- test.sh
- ttsbark.py

Creating a separate environment for the ttsbark project

make ttsbark
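
For context, ttsbark.py builds on the suno-ai/bark library. A minimal standalone sketch of the underlying calls (assuming the bark and scipy packages installed in the environment created above; this is not the backend's exact code) looks roughly like this:

```python
# Rough sketch of the bark calls the backend wraps (assumes the bark and
# scipy packages from the ttsbark environment).
from bark import SAMPLE_RATE, generate_audio, preload_models
from scipy.io.wavfile import write as write_wav

preload_models()                           # download/load the bark checkpoints
audio = generate_audio("Hello, LocalAI!")  # numpy array of audio samples
write_wav("output.wav", SAMPLE_RATE, audio)
```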

Testing the gRPC server

<path of your Python interpreter> -m unittest test.py

For example:

/opt/conda/envs/bark/bin/python -m unittest backend/python/bark/test.py
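
The test exercises the backend over gRPC using the generated stubs. Below is a hedged sketch of a direct client call; the stub name, the TTSRequest fields (text/model/dst), and the default address are assumptions, so verify them against backend_pb2.py and backend_pb2_grpc.py:

```python
# Hypothetical client sketch, not the project's test code: the stub name,
# TTSRequest fields, and default port are assumptions -- check them against
# backend_pb2.py and backend_pb2_grpc.py.
import grpc

import backend_pb2
import backend_pb2_grpc


def tts_once(addr: str = "localhost:50051") -> None:
    with grpc.insecure_channel(addr) as channel:
        stub = backend_pb2_grpc.BackendStub(channel)
        request = backend_pb2.TTSRequest(
            text="hello world",
            model="v2/en_speaker_4",  # a bark voice preset
            dst="/tmp/out.wav",       # where the server writes the wav file
        )
        result = stub.TTS(request)
        print(result)


if __name__ == "__main__":
    tts_once()
```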