name: "phi-3-chat"
license: mit

description: |
  The Phi-3-Mini-4K-Instruct is a 3.8B-parameter, lightweight, state-of-the-art open model trained with the Phi-3 datasets, which include both synthetic data and filtered publicly available website data, with a focus on high-quality and reasoning-dense properties. The model belongs to the Phi-3 family; the Mini version comes in two variants, 4K and 128K, which is the context length (in tokens) it can support. The model has undergone a post-training process that incorporates both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks testing common sense, language understanding, math, code, long context and logical reasoning, Phi-3-Mini-4K-Instruct showcased robust, state-of-the-art performance among models with fewer than 13 billion parameters.

urls:
- https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf

tags:
- llm
- gguf
- gpu
- cpu
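
# Note: config_file below holds the model configuration that is applied when this
# gallery entry is installed; the template fields use Go text/template syntax
# ({{ .RoleName }}, {{ .Content }}, {{ .Input }}) to build the Phi-3 prompt format.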
config_file: |
  mmap: true
  template:
    chat_message: |
      <|{{ .RoleName }}|>
      {{.Content}}<|end|>
    chat: |
      {{.Input}}
      <|assistant|>
    completion: |
      {{.Input}}
  context_size: 4096
  f16: true
  stopwords:
  - <|end|>
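
# For illustration only (not part of the gallery schema): with the templates above,
# a single user turn such as "How are you?" renders roughly as
#
#   <|user|>
#   How are you?<|end|>
#   <|assistant|>
#
# and generation stops once the model emits the <|end|> stopword. After installation,
# the model should be reachable through LocalAI's OpenAI-compatible
# /v1/chat/completions endpoint under the name "phi-3-chat".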