LocalAI/extra/grpc/bark

Creating a separate environment for the ttsbark project:

make ttsbark
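A rough manual equivalent of the `make ttsbark` target, assuming it creates a conda environment from the `ttsbark.yml` spec in this directory (the environment name `bark` is an assumption taken from the example path below, not read from the Makefile):

```shell
# Hypothetical manual equivalent of `make ttsbark` (assumption: the target
# builds a conda environment from ttsbark.yml).
conda env create --file ttsbark.yml   # environment name is taken from ttsbark.yml
conda activate bark                   # "bark" is an assumed environment name
```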

Testing the gRPC server:

<The path of your python interpreter> -m unittest test_ttsbark.py

For example:

/opt/conda/envs/bark/bin/python -m unittest extra/grpc/bark/test_ttsbark.py
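The invocation above is the standard `<interpreter> -m unittest <file>` pattern. As a self-contained illustration of that pattern (using a throwaway stub test in place of `test_ttsbark.py`, which needs the bark environment and its model downloads), this sketch runs a test file through an explicitly chosen interpreter:

```python
import pathlib
import subprocess
import sys
import tempfile

# A throwaway stand-in for test_ttsbark.py: any unittest-style test file
# works with the `<interpreter> -m unittest <file>` invocation shown above.
TEST_SRC = '''\
import unittest

class SmokeTest(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)
'''

def run_unittest_file(interpreter: str, filename: str, workdir: str) -> int:
    """Run `<interpreter> -m unittest <filename>` from workdir; return the exit code."""
    return subprocess.run(
        [interpreter, "-m", "unittest", filename], cwd=workdir
    ).returncode

with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "test_smoke.py").write_text(TEST_SRC)
    # Exit code 0 means all tests passed, which is how a CI job would
    # gate on the real bark tests.
    rc = run_unittest_file(sys.executable, "test_smoke.py", d)
```

Passing a different interpreter path (such as `/opt/conda/envs/bark/bin/python`) is what ties the test run to the conda environment created above.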