LocalAI/backend/python/mamba
Latest commit 20136ca8b7 by Ettore Di Giacinto
feat(tts): add Elevenlabs and OpenAI TTS compatibility layer (#1834)
* feat(elevenlabs): map elevenlabs API support to TTS

This allows elevenlabs clients to work with LocalAI automatically by
supporting the elevenlabs API.

The elevenlabs server endpoint is implemented so that it is wired to
LocalAI's TTS endpoints.

Fixes: https://github.com/mudler/LocalAI/issues/1809
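
As a hedged illustration of that wiring (a sketch only; the port, route, voice id and model name below are assumptions based on the public elevenlabs API, not taken from this commit), an elevenlabs-style text-to-speech request would simply be pointed at the local LocalAI instance:

```python
# Sketch only: assumes LocalAI is listening on localhost:8080 and mirrors the
# elevenlabs route POST /v1/text-to-speech/{voice-id}; the voice id and model
# name are placeholders, and the response body is the rendered audio.
import requests

resp = requests.post(
    "http://localhost:8080/v1/text-to-speech/my-voice",
    json={"text": "Hello from LocalAI", "model_id": "my-tts-model"},
)
resp.raise_for_status()
with open("output.wav", "wb") as f:
    f.write(resp.content)
```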

* feat(openai/tts): compat layer with openai tts

Fixes: #1276
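
The OpenAI side works the same way: a client that already speaks OpenAI's audio/speech API should only need its base URL changed. A minimal sketch, assuming LocalAI serves the OpenAI-style /v1/audio/speech route on localhost:8080, with placeholder model and voice names:

```python
# Sketch only: the payload follows OpenAI's text-to-speech API (model, input,
# voice); the base URL, model and voice values are placeholders for a local setup.
import requests

resp = requests.post(
    "http://localhost:8080/v1/audio/speech",
    json={"model": "my-tts-model", "input": "Hello from LocalAI", "voice": "alloy"},
)
resp.raise_for_status()
with open("speech.wav", "wb") as f:
    f.write(resp.content)
```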

* fix: adapt tts CLI
2024-03-14 23:08:34 +01:00
backend_mamba.py feat(extra-backends): Improvements, adding mamba example (#1618) 2024-01-20 17:56:08 +01:00
backend_pb2_grpc.py feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00
backend_pb2.py feat(tts): add Elevenlabs and OpenAI TTS compatibility layer (#1834) 2024-03-14 23:08:34 +01:00
install.sh feat(intel): add diffusers/transformers support (#1746) 2024-03-07 14:37:45 +01:00
Makefile feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00
README.md feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00
run.sh feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00
test_backend_mamba.py feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00
test.sh feat: 🐍 add mamba support (#1589) 2024-01-19 23:42:50 +01:00

Creating a separate environment for the mamba project

make mamba
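
Once the environment exists, the backend is presumably exercised like the other LocalAI Python backends in this tree: run.sh starts backend_mamba.py as a gRPC server and test_backend_mamba.py drives it through the generated stubs. A minimal client sketch under those assumptions (server on localhost:50051, the usual Backend service with Health/LoadModel/Predict RPCs, placeholder model name):

```python
# Sketch only: assumes run.sh has started backend_mamba.py as a gRPC server on
# localhost:50051 and that backend_pb2 / backend_pb2_grpc expose the usual
# LocalAI Backend service (Health, LoadModel, Predict). Model name is a placeholder.
import grpc

import backend_pb2
import backend_pb2_grpc

with grpc.insecure_channel("localhost:50051") as channel:
    stub = backend_pb2_grpc.BackendStub(channel)
    print(stub.Health(backend_pb2.HealthMessage()))
    stub.LoadModel(backend_pb2.ModelOptions(Model="state-spaces/mamba-130m"))
    reply = stub.Predict(backend_pb2.PredictOptions(Prompt="The capital of France is"))
    print(reply.message)
```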