LocalAI/core
cryptk 24d7dadfed
feat: kong cli refactor fixes #1955 (#1974)
* feat: migrate to alecthomas/kong for CLI

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: bring in new flag for granular log levels

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* chore: go mod tidy

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: allow loading cli flag values from ["./localai.yaml", "~/.config/localai.yaml", "/etc/localai.yaml"] in that order

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: load from .env file instead of a yaml file

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: better loading for environment files

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat(doc): add initial documentation about configuration

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: remove test log lines

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: integrate new documentation into existing pages

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: add documentation on .env files

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: cleanup some documentation table errors

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: refactor CLI logic out to its own package under core/cli

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

---------

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-04-11 09:19:24 +02:00
backend fix(llama.cpp): set better defaults for llama.cpp (#1961) 2024-04-06 22:56:45 +02:00
cli feat: kong cli refactor fixes #1955 (#1974) 2024-04-11 09:19:24 +02:00
config fix(llama.cpp): set better defaults for llama.cpp (#1961) 2024-04-06 22:56:45 +02:00
http fix(llama.cpp): set better defaults for llama.cpp (#1961) 2024-04-06 22:56:45 +02:00
schema fix(llama.cpp): set better defaults for llama.cpp (#1961) 2024-04-06 22:56:45 +02:00
services feat: first pass at improving logging (#1956) 2024-04-04 09:24:22 +02:00
startup feat: first pass at improving logging (#1956) 2024-04-04 09:24:22 +02:00