LocalAI/examples/configurations/llava/llava.yaml

backend: llama-cpp              # use the llama.cpp backend
context_size: 4096              # prompt context window in tokens
f16: true                       # load weights in 16-bit floating point
threads: 11                     # CPU threads used for inference
gpu_layers: 90                  # number of layers offloaded to the GPU
mmap: true                      # memory-map the model file
name: llava                     # model name to reference in API requests
roles:                          # prompt prefixes for each chat role
  user: "USER:"
  assistant: "ASSISTANT:"
  system: "SYSTEM:"
parameters:
  model: ggml-model-q4_k.gguf   # quantized LLaVA language model weights
  temperature: 0.2
  top_k: 40
  top_p: 0.95
template:
  chat: chat-simple             # chat prompt template to apply
mmproj: mmproj-model-f16.gguf   # multimodal projector weights for image input
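
# Usage sketch: a minimal example of querying this model through LocalAI's
# OpenAI-compatible chat endpoint. It assumes LocalAI is running with this
# configuration and the model/projector files above in its models directory,
# listening on the default port 8080; the image URL is a placeholder.
#
#   curl http://localhost:8080/v1/chat/completions \
#     -H "Content-Type: application/json" \
#     -d '{
#       "model": "llava",
#       "messages": [
#         {
#           "role": "user",
#           "content": [
#             {"type": "text", "text": "What is in this image?"},
#             {"type": "image_url", "image_url": {"url": "https://example.com/image.jpg"}}
#           ]
#         }
#       ],
#       "temperature": 0.2
#     }'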