mirror of https://github.com/mudler/LocalAI.git (synced 2024-06-07 19:40:48 +00:00)
Update openai-functions.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
This commit is contained in:
parent 9e8b34427a, commit cf513efa78
@@ -85,9 +85,16 @@ When running the python script, be sure to:
The function calls map automatically to grammars, which are currently supported only by llama.cpp. However, it is possible to turn off the use of grammars and instead extract the tool arguments from the LLM responses, by specifying `no_grammar` and a regex to map the response from the LLM in the YAML file:

```yaml
name: model_name
parameters:
  # Model file name
  model: model/name

function:
  # set to true to not use grammars
  no_grammar: true
  # set a regex to extract the function tool arguments from the LLM response
  response_regex: "(?P<function>\w+)\s*\((?P<arguments>.*)\)"
```
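As an illustration of how a named-group regex like the one above pulls the function name and arguments out of a raw completion, here is a minimal Python sketch (the sample LLM output is made up for illustration; this is not LocalAI's internal parsing code):

```python
import re

# The response_regex from the YAML above: named groups capture the
# function name and its argument string from a raw LLM completion.
response_regex = r"(?P<function>\w+)\s*\((?P<arguments>.*)\)"

# Hypothetical raw model output when grammars are disabled.
llm_response = 'get_weather({"city": "Rome"})'

match = re.search(response_regex, llm_response)
if match:
    print(match.group("function"))   # get_weather
    print(match.group("arguments"))  # {"city": "Rome"}
```

The named groups are what make the extraction work: `function` captures the callable's name and `arguments` captures everything between the parentheses.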
The response regex has to be a regex with named parameters, so that the function name and the arguments can be scanned out of the response. For instance, consider: