From cf513efa78b9c82a5d143016c73fca2940f48768 Mon Sep 17 00:00:00 2001
From: Ettore Di Giacinto
Date: Fri, 10 May 2024 17:09:51 +0200
Subject: [PATCH] Update openai-functions.md

Signed-off-by: Ettore Di Giacinto
---
 docs/content/docs/features/openai-functions.md | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/docs/content/docs/features/openai-functions.md b/docs/content/docs/features/openai-functions.md
index 94802981..feb8bc74 100644
--- a/docs/content/docs/features/openai-functions.md
+++ b/docs/content/docs/features/openai-functions.md
@@ -85,9 +85,16 @@ When running the python script, be sure to:
 The functions calls maps automatically to grammars which are currently supported only by llama.cpp, however, it is possible to turn off the use of grammars, and extract tool arguments from the LLM responses, by specifying in the YAML file `no_grammar` and a regex to map the response from the LLM:
 
 ```yaml
+name: model_name
+parameters:
+  # Model file name
+  model: model/name
+
 function:
-  no_grammar: true
-  response_regex: "..."
+  # set to true to not use grammars
+  no_grammar: true
+  # set a regex to extract the function tool arguments from the LLM response
+  response_regex: "(?P<function>\w+)\s*\((?P<arguments>.*)\)"
 ```
 
 The response regex have to be a regex with named parameters to allow to scan the function name and the arguments. For instance, consider:
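
A minimal sketch of the extraction this patch documents: a regex with named groups is applied to the raw LLM response to pull out the function name and its arguments. The pattern mirrors the one in the patch; the sample response string and the group names are assumptions for illustration, not taken from LocalAI's code.

```python
import re

# Named-group pattern, as in the patched YAML example (Python's `re`
# uses the same (?P<name>...) syntax).
pattern = re.compile(r"(?P<function>\w+)\s*\((?P<arguments>.*)\)")

# Hypothetical raw LLM response containing a tool call.
response = 'get_weather({"city": "Rome"})'

match = pattern.search(response)
if match:
    print(match.group("function"))   # -> get_weather
    print(match.group("arguments"))  # -> {"city": "Rome"}
```

With `no_grammar: true`, output of this shape is not constrained by a grammar, so the regex is the only thing mapping free-form model text back to a tool invocation; a response that does not match simply yields no tool call.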