This demonstrates the vector store backend in its simplest form. You can add tasks and then search/sort them using the TUI.

To build and run:

```sh
$ go get .
$ go run .
```

A separate LocalAI instance is required, of course. For example:

```sh
$ docker run -e DEBUG=true --rm -it -p 8080:8080 <LocalAI-image> bert-cpp
```
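
Once the instance is up, the TUI's add and search operations boil down to embedding the task text and then writing to and querying LocalAI's vector store. Below is a minimal sketch of that flow in Go: the `/embeddings` call follows the OpenAI-compatible API, while the `/stores/set` and `/stores/find` paths and payload shapes are assumptions and may differ from what this app actually does.

```go
// Sketch only: embed task text, store it in the vector store, then search.
// The /stores/* paths and payload shapes below are assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

const baseURL = "http://localhost:8080" // the LocalAI instance started above

// post sends a JSON request to LocalAI and optionally decodes a JSON response.
func post(path string, req, resp any) error {
	body, err := json.Marshal(req)
	if err != nil {
		return err
	}
	r, err := http.Post(baseURL+path, "application/json", bytes.NewReader(body))
	if err != nil {
		return err
	}
	defer r.Body.Close()
	if resp == nil {
		return nil
	}
	return json.NewDecoder(r.Body).Decode(resp)
}

// embed returns the embedding vector for text via the OpenAI-compatible
// embeddings endpoint, using the model preloaded above.
func embed(text string) ([]float32, error) {
	var out struct {
		Data []struct {
			Embedding []float32 `json:"embedding"`
		} `json:"data"`
	}
	if err := post("/embeddings", map[string]any{"model": "bert-cpp", "input": text}, &out); err != nil {
		return nil, err
	}
	if len(out.Data) == 0 {
		return nil, fmt.Errorf("no embedding returned")
	}
	return out.Data[0].Embedding, nil
}

func main() {
	// Add a task: store its text keyed by its embedding (assumed /stores/set).
	vec, err := embed("buy milk")
	if err != nil {
		panic(err)
	}
	if err := post("/stores/set", map[string]any{
		"keys":   [][]float32{vec},
		"values": []string{"buy milk"},
	}, nil); err != nil {
		panic(err)
	}

	// Search: find the stored tasks closest to a query (assumed /stores/find).
	qvec, err := embed("groceries")
	if err != nil {
		panic(err)
	}
	var found struct {
		Values []string `json:"values"`
	}
	if err := post("/stores/find", map[string]any{"key": qvec, "topk": 5}, &found); err != nil {
		panic(err)
	}
	fmt.Println(found.Values)
}
```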