LocalAI/backend

Latest commit 20d637e7b7 by ok2sh (2023-11-21 19:26:39 +01:00):
fix: ExLlama Backend Context Size & Rope Scaling (#1311)

* fix: context_size not propagated to exllama backend
* fix: exllama rope scaling
cpp            refactor: move backends into the backends directory (#1279)   2023-11-13 22:40:16 +01:00
go             refactor: rename llama-stable to llama-ggml (#1287)           2023-11-18 08:18:43 +01:00
python         fix: ExLlama Backend Context Size & Rope Scaling (#1311)     2023-11-21 19:26:39 +01:00
backend.proto  refactor: move backends into the backends directory (#1279)   2023-11-13 22:40:16 +01:00