LocalAI/pkg
Ettore Di Giacinto 3bab307904
fix(llama): resolve lora adapters correctly from the model file (#964)
**Description**

Previously we expected absolute paths; this change resolves the lora adapter path relative to the model file (as one would expect).
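
A minimal sketch of the intended resolution logic (the helper name and example paths below are illustrative, not the actual LocalAI code): an absolute adapter path is kept as-is, while a relative one is joined with the directory that contains the model file.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolveLoraPath is a hypothetical helper illustrating the fix:
// absolute adapter paths pass through unchanged, relative ones are
// resolved against the directory containing the model file.
func resolveLoraPath(modelFile, loraAdapter string) string {
	if loraAdapter == "" || filepath.IsAbs(loraAdapter) {
		return loraAdapter
	}
	return filepath.Join(filepath.Dir(modelFile), loraAdapter)
}

func main() {
	// a relative adapter declared next to the model file
	fmt.Println(resolveLoraPath("/models/llama-2-7b.gguf", "adapters/my-lora.bin"))
	// -> /models/adapters/my-lora.bin

	// an absolute adapter path is left untouched
	fmt.Println(resolveLoraPath("/models/llama-2-7b.gguf", "/opt/loras/my-lora.bin"))
	// -> /opt/loras/my-lora.bin
}
```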

**Notes for Reviewers**


**[Signed commits](../CONTRIBUTING.md#signing-off-on-commits-developer-certificate-of-origin)**
- [ ] Yes, I signed my commits.

2023-08-27 10:11:32 +02:00
| Directory | Last commit | Date |
| --- | --- | --- |
| assets | feat: Update gpt4all, support multiple implementations in runtime (#472) | 2023-06-01 23:38:52 +02:00 |
| backend | fix(llama): resolve lora adapters correctly from the model file (#964) | 2023-08-27 10:11:32 +02:00 |
| gallery | fix: match lowercase of the input, not of the model | 2023-08-08 00:46:22 +02:00 |
| grammar | feat: update integer, number and string rules - allow primitives as root types (#862) | 2023-08-03 23:32:30 +02:00 |
| grpc | feat: Allow to load lora adapters for llama.cpp (#955) | 2023-08-25 21:58:46 +02:00 |
| langchain | feat: add LangChainGo Huggingface backend (#446) | 2023-06-01 12:00:06 +02:00 |
| model | feat: backend monitor shutdown endpoint, process based (#938) | 2023-08-23 18:38:37 +02:00 |
| stablediffusion | feat: support upscaled image generation with esrgan (#509) | 2023-06-05 17:21:38 +02:00 |
| utils | fix: do not break on newlines on function returns (#864) | 2023-08-04 21:46:36 +02:00 |