LocalAI/core/config

Latest commit: 180cd4ccda by Ettore Di Giacinto (2024-04-21 16:34:00 +02:00)

fix(llama.cpp-ggml): fixup max_tokens for old backend (#2094)
fix(llama.cpp-ggml): set 0 as default for `max_tokens`

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
application_config.go   feat: fiber logs with zerlog and add trace level (#2082)         2024-04-20 10:43:37 +02:00
backend_config.go       fix(llama.cpp-ggml): fixup max_tokens for old backend (#2094)    2024-04-21 16:34:00 +02:00
config_test.go          refactor: move remaining api packages to core (#1731)            2024-03-01 16:19:53 +01:00
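The latest commit to `backend_config.go` sets 0 as the default for `max_tokens` so the old llama.cpp-ggml backend behaves correctly when the user leaves it unset. A minimal sketch of that defaulting pattern is below; the struct, field names, and `SetDefaults` method are illustrative assumptions, not LocalAI's actual API.

```go
package main

import "fmt"

// BackendConfig is a simplified stand-in for a backend configuration.
// A nil MaxTokens pointer means the user did not set max_tokens at all.
type BackendConfig struct {
	Backend   string
	MaxTokens *int
}

// SetDefaults fills in max_tokens only when the user left it unset,
// using 0 as the default (the value the commit chooses for the old
// ggml backend); an explicit user value is never overwritten.
func (c *BackendConfig) SetDefaults() {
	if c.MaxTokens == nil {
		def := 0
		c.MaxTokens = &def
	}
}

func main() {
	cfg := BackendConfig{Backend: "llama-ggml"}
	cfg.SetDefaults()
	fmt.Println(*cfg.MaxTokens) // prints 0

	n := 128
	cfg2 := BackendConfig{Backend: "llama-ggml", MaxTokens: &n}
	cfg2.SetDefaults()
	fmt.Println(*cfg2.MaxTokens) // prints 128
}
```

Using a pointer (or an equivalent "unset" sentinel) is what lets the defaulting step distinguish "user asked for 0" from "user set nothing", which is the crux of this kind of fix.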