LocalAI/backend
Latest commit: 86627b27f7 by cryptk, 2024-05-16 19:15:46 +02:00
fix: add setuptools to all requirements-intel.txt files for python backends (#2333)
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
Name          | Latest commit                                                                            | Date
cpp           | feat(llama.cpp): add distributed llama.cpp inferencing (#2324)                           | 2024-05-15 01:17:02 +02:00
go            | feat(llama.cpp): do not specify backends to autoload and add llama.cpp variants (#2232)  | 2024-05-04 17:56:12 +02:00
python        | fix: add setuptools to all requirements-intel.txt files for python backends (#2333)      | 2024-05-16 19:15:46 +02:00
backend.proto | feat(llama.cpp): add flash_attention and no_kv_offloading (#2310)                        | 2024-05-13 19:07:51 +02:00
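For orientation, backend.proto is the gRPC contract that the cpp, go, and python backend implementations in this directory serve. The following is only a minimal sketch of what such a service definition looks like, under assumed names: the service, method set, message shapes, field names (including flash_attention and no_kv_offloading, referenced by #2310), and field numbers shown here are illustrative and not copied from the real file.

```proto
// Illustrative sketch only: method set, messages, and field numbers are
// assumptions, not the actual contents of LocalAI's backend.proto.
syntax = "proto3";

package backend;

// A backend is a model-serving process the core talks to over gRPC.
service Backend {
  rpc Health(HealthMessage) returns (Reply) {}
  rpc LoadModel(ModelOptions) returns (Result) {}
  rpc Predict(PredictOptions) returns (Reply) {}
  rpc PredictStream(PredictOptions) returns (stream Reply) {}
}

message HealthMessage {}

message ModelOptions {
  string model = 1;
  int32 context_size = 2;
  // Flags of this kind were added for llama.cpp in #2310 (names assumed here).
  bool flash_attention = 3;
  bool no_kv_offloading = 4;
}

message PredictOptions {
  string prompt = 1;
  int32 tokens = 2;
  float temperature = 3;
}

message Reply {
  bytes message = 1;
}

message Result {
  string message = 1;
  bool success = 2;
}
```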