LocalAI/pkg
Latest commit: 1a3dedece0 by Ettore Di Giacinto, 2024-05-21 14:33:47 +02:00
dependencies(grpcio): bump to fix CI issues (#2362)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
| Directory | Last commit | Last updated |
|---|---|---|
| assets | feat(llama.cpp): add distributed llama.cpp inferencing (#2324) | 2024-05-15 01:17:02 +02:00 |
| downloader | fix: reduce chmod permissions for created files and directories (#2137) | 2024-04-26 00:47:06 +02:00 |
| functions | feat(functions): don't use yaml.MapSlice (#2354) | 2024-05-20 08:31:06 +02:00 |
| gallery | feat(ui): prompt for chat, support vision, enhancements (#2259) | 2024-05-08 00:42:34 +02:00 |
| grpc | refactor(application): introduce application global state (#2072) | 2024-04-29 17:42:37 +00:00 |
| langchain | feat(llama.cpp): do not specify backends to autoload and add llama.cpp variants (#2232) | 2024-05-04 17:56:12 +02:00 |
| model | dependencies(grpcio): bump to fix CI issues (#2362) | 2024-05-21 14:33:47 +02:00 |
| stablediffusion | feat: support upscaled image generation with esrgan (#509) | 2023-06-05 17:21:38 +02:00 |
| startup | feat: Galleries UI (#2104) | 2024-04-23 09:22:58 +02:00 |
| store | feat(stores): Vector store backend (#1795) | 2024-03-22 21:14:04 +01:00 |
| templates | fix: reduce chmod permissions for created files and directories (#2137) | 2024-04-26 00:47:06 +02:00 |
| tinydream | feat: add tiny dream stable diffusion support (#1283) | 2023-12-24 19:27:24 +00:00 |
| utils | feat(llama.cpp): Totally decentralized, private, distributed, p2p inference (#2343) | 2024-05-20 19:17:59 +02:00 |
| xsync | feat(ui): prompt for chat, support vision, enhancements (#2259) | 2024-05-08 00:42:34 +02:00 |
| xsysinfo | feat(startup): show CPU/GPU information with --debug (#2241) | 2024-05-05 09:10:23 +02:00 |