# LocalAI/backend/python/exllama

To create a separate Python environment for the exllama backend, run:

```bash
make exllama
```