Compare commits

33 Commits

Author SHA1 Message Date
cryptk edcfb2d6b1
Merge 7407254d08 into 6559ac11b1 2024-05-08 03:39:29 +00:00
Chris Jowett 7407254d08
fix: mamba backend is cublas only
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 22:39:20 -05:00
Chris Jowett 86ccfaec05
fix: cleanup
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 21:56:10 -05:00
Chris Jowett 66230ce278
feat: make installs and tests more consistent, cleanup some deps
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:19 -05:00
Chris Jowett 72f1dcd203
feat: more size optimization work
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:19 -05:00
Chris Jowett cb2559fa26
fix: add setuptools to requirements-install for mamba
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:18 -05:00
Chris Jowett 8b566c4f43
feat: cleanup and optimization work for uv migration
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:18 -05:00
Chris Jowett 59dec935e8
fix: adjust the pwd for valle tests
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:18 -05:00
Chris Jowett 2e15003329
fix: correct filename for transformers-musicgen tests
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:17 -05:00
Chris Jowett 541cf7d412
fix: parler tests venv py dir fix
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:17 -05:00
Chris Jowett 4df197a866
fix: add some more missing dependencies to python backends
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:17 -05:00
Chris Jowett 8f55b679da
fix: add missing acclerate dependencies
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:16 -05:00
Chris Jowett 29da1e9393
fix: adjust file perms on all install/run/test scripts
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:16 -05:00
Chris Jowett 286b93cd99
fix: add uv install to the rest of test-extra.yml
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:16 -05:00
Chris Jowett 9c72f9e50d
feat: migrate vllm to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:16 -05:00
Chris Jowett f50e94480a
feat: migrate vall-e-x to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:15 -05:00
Chris Jowett 2ed5b19821
feat: migrate transformers-musicgen to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:15 -05:00
Chris Jowett 6f43129cac
feat: migrate transformers backend to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:15 -05:00
Chris Jowett 9bb524802a
fix: make sure file exists before installing on intel images
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:14 -05:00
Chris Jowett d68c40a15d
fix: install uv for tests-linux
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:14 -05:00
Chris Jowett 0295ae24b2
feat: migrate sentencetransformers to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:14 -05:00
Chris Jowett 58ab258129
feat: migrate rerankers to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:13 -05:00
Chris Jowett f649ab9caf
fix: fix tests
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:13 -05:00
Chris Jowett 933fbf91a2
feat: migrate petals to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:13 -05:00
Chris Jowett 531b1a6763
feat: migrate parler to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:12 -05:00
Chris Jowett 31aba66895
feat: migrate mamba to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:12 -05:00
Chris Jowett c333438627
feat: migrate exllama2 to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:12 -05:00
Chris Jowett ab714cd769
feat: convert exllama over to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:11 -05:00
Chris Jowett 8700238156
feat: migrate autogtpq bark coqui from conda to uv
Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:11 -05:00
Chris Jowett d7e5523538
feat: migrate diffusers backend from conda to uv
- replace conda with UV for diffusers install (prototype for all
    extras backends)
  - add ability to build docker with one/some/all extras backends
    instead of all or nothing

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
2024-05-07 20:18:11 -05:00
Ettore Di Giacinto 6559ac11b1
feat(ui): prompt for chat, support vision, enhancements (#2259)
* feat(ui): allow to set system prompt for chat

Make also the models in the index clickable, and display as table

Fixes #2257

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* feat(vision): support also png with base64 input

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* feat(ui): support vision and upload of files

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* display the processed image

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* make trust remote code stand out

Signed-off-by: mudler <mudler@localai.io>

* feat(ui): track in progress job across index/model gallery

Signed-off-by: mudler <mudler@localai.io>

* minor fixups

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>
2024-05-08 00:42:34 +02:00
Ettore Di Giacinto 02ec546dd6
models(gallery): Add Soliloquy (#2260)
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-05-08 00:14:19 +02:00
LocalAI [bot] 995aa5ed21
⬆️ Update ggerganov/llama.cpp (#2263)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-05-07 21:39:12 +00:00
71 changed files with 753 additions and 210 deletions

@@ -84,6 +84,7 @@ RUN apt-get update && \
apt-get install -y --no-install-recommends \
espeak-ng \
espeak \
python3-dev \
python3-venv && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

@@ -5,7 +5,7 @@ BINARY_NAME=local-ai
# llama.cpp versions
GOLLAMA_STABLE_VERSION?=2b57a8ae43e4699d3dc5d1496a1ccd42922993be
CPPLLAMA_VERSION?=858f6b73f6e57a62523d16a955d565254be889b4
CPPLLAMA_VERSION?=b6aa6702030320a3d5fbc2508307af0d7c947e40
# gpt4all version
GPT4ALL_REPO?=https://github.com/nomic-ai/gpt4all

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -4,4 +4,4 @@ grpcio==1.63.0
protobuf
torch
certifi
transformers==4.38.2
transformers

backend/python/autogptq/test.sh (new executable file, 16 lines)
@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -3,4 +3,4 @@ bark==0.1.5
grpcio==1.63.0
protobuf
certifi
transformers==4.38.2
transformers

@@ -1,9 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the bark tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -3,4 +3,4 @@ TTS==0.22.0
grpcio==1.63.0
protobuf
certifi
transformers==4.38.2
transformers

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the coqui tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
python -m venv ${MY_DIR}/venv
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -6,5 +6,5 @@ opencv-python
pillow
protobuf
torch
transformers==4.38.2
transformers
certifi

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,6 +1,8 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
if [ "$BUILD_TYPE" != "cublas" ]; then
echo "[exllama] Attention!!! Nvidia GPU is required - skipping installation"
exit 0
@@ -11,14 +13,22 @@ MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
git clone https://github.com/turboderp/exllama $MY_DIR/source
uv pip install --requirement ${MY_DIR}/source/requirements.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
cp -rfv ./*py $MY_DIR/source/

@@ -1,5 +1,5 @@
grpcio==1.63.0
protobuf
torch
transformers==4.38.2
transformers
certifi

backend/python/exllama/test.sh (new executable file, 16 lines)
@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -4,8 +4,10 @@ set -e
## A bash script installs the required dependencies of VALL-E-X and prepares the environment
EXLLAMA2_VERSION=c0ddebaaaf8ffd1b3529c2bb654e650bce2f790f
BUILD_ISOLATION_FLAG=""
if [ "$BUILD_TYPE" != "cublas" ]; then
echo "[exllamav2] Attention!!! Nvidia GPU is required - skipping installation"
echo "[exllama] Attention!!! Nvidia GPU is required - skipping installation"
exit 0
fi
@@ -14,18 +16,28 @@ MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
git clone https://github.com/turboderp/exllamav2 source
pushd source && git checkout -b build ${EXLLAMA2_VERSION} && popd
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
uv pip install --requirement ${MY_DIR}/source/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
cp -rfv ./*py $MY_DIR/source/
git clone https://github.com/turboderp/exllamav2 $MY_DIR/source
pushd ${MY_DIR}/source && git checkout -b build ${EXLLAMA2_VERSION} && popd
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
# This installs exllamav2 in JIT mode so it will compile the appropriate torch extension at runtime
EXLLAMA_NOCOMPILE= uv pip install ${BUILD_ISOLATION_FLAG} ${MY_DIR}/source/
cp -rfv ./*py $MY_DIR/source/
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge

@@ -0,0 +1,4 @@
# This is here to trigger the install script to add --no-build-isolation to the uv pip install commands
# exllama2 does not specify it's build requirements per PEP517, so we need to provide some things ourselves
wheel
setuptools

@@ -1,4 +1,7 @@
accelerate
grpcio==1.63.0
protobuf
certifi
certifi
torch
wheel
setuptools

backend/python/exllama2/test.sh (new executable file, 16 lines)
@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,6 +1,6 @@
.PHONY: mamba
mamba: protogen
bash install.sh
bash install.sh
.PHONY: run
run: protogen

@@ -1,34 +1,36 @@
#!/bin/bash
set -e
##
## A bash script installs the required dependencies of VALL-E-X and prepares the environment
set -ex
if [ "$BUILD_TYPE" != "cublas" ]; then
echo "[mamba] Attention!!! nvcc is required - skipping installation"
exit 0
fi
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
# mabma does not specify it's build dependencies per PEP517, so we need to disable build isolation
# this also means that we need to install the basic build dependencies into the venv ourselves
# https://github.com/Dao-AILab/causal-conv1d/issues/24
uv pip install --requirement ${MY_DIR}/requirements-install.txt
uv pip install --no-build-isolation --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -4,4 +4,4 @@
packaging
setuptools
wheel
torch
torch==2.2.0

@@ -1,7 +1,6 @@
accelerate
causal-conv1d==1.2.0.post2
mamba-ssm==1.2.0.post1
grpcio==1.63.0
torch==2.1.2
protobuf
certifi
transformers==4.38.2
transformers

@@ -20,7 +20,7 @@ class TestBackendServicer(unittest.TestCase):
This class contains methods to test the startup and shutdown of the gRPC service.
"""
def setUp(self):
self.service = subprocess.Popen(["python", "backend_vllm.py", "--addr", "localhost:50051"])
self.service = subprocess.Popen(["python", "backend_mamba.py", "--addr", "localhost:50051"])
time.sleep(10)
def tearDown(self) -> None:

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that sets up and runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_backend_mamba.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,14 +1,32 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
# https://github.com/descriptinc/audiotools/issues/101
@@ -16,14 +34,6 @@ fi
PYDIR=$(ls $MY_DIR/venv/lib)
curl -L https://raw.githubusercontent.com/protocolbuffers/protobuf/main/python/google/protobuf/internal/builder.py -o $MY_DIR/venv/lib/$PYDIR/site-packages/google/protobuf/internal/builder.py
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge
fi

@@ -4,4 +4,4 @@ protobuf
torch
git+https://github.com/huggingface/parler-tts.git@10016fb0300c0dc31a0fb70e26f3affee7b62f16
certifi
transformers==4.38.2
transformers

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that sets up and runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_parler.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -1,3 +1,3 @@
git+https://github.com/bigscience-workshop/petals
certifi
transformers==4.38.2
transformers

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_petals.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -3,4 +3,4 @@ rerankers[transformers]
grpcio==1.63.0
protobuf
certifi
transformers==4.38.2
transformers

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_reranker.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

@@ -1,6 +1,6 @@
accelerate
sentence-transformers==2.5.1
transformers==4.38.2
transformers
grpcio==1.63.0
protobuf
certifi

@@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_sentencetransformers.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

@@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare it's build time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

View File

@ -1,5 +1,5 @@
accelerate
transformers==4.38.2
transformers
grpcio==1.63.0
protobuf
torch

View File

@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs Python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_transformers.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

View File

@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

View File

@ -1,5 +1,5 @@
accelerate
transformers==4.38.2
transformers
grpcio==1.63.0
protobuf
torch

View File

@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs Python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_transformers_server.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

View File

@ -1,34 +1,39 @@
#!/bin/bash
set -ex
##
## A bash script that installs the required dependencies of VALL-E-X and prepares the environment
export VALL_E_X_VERSION=3faaf8ccadb154d63b38070caf518ce9309ea0f4
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
git clone https://github.com/Plachtaa/VALL-E-X.git $MY_DIR/source
pushd $MY_DIR/source && git checkout -b build $VALL_E_X_VERSION && popd
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
uv pip install --requirement ${MY_DIR}/source/requirements.txt
cp -rfv ./*py $MY_DIR/source/
cp -rfv ./*py $MY_DIR/source/
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge

View File

@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs Python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
pushd $MY_DIR/source && python -m unittest $MY_DIR/test.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}/source
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

View File

@ -1,22 +1,31 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
uv pip install --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi

View File

@ -0,0 +1 @@
flash-attn

View File

@ -0,0 +1,6 @@
# mamba does not specify its build dependencies per PEP-517, so we need to disable build isolation
# this also means that we need to install the basic build dependencies into the venv ourselves
# https://github.com/Dao-AILab/causal-conv1d/issues/24
packaging
setuptools
wheel

View File

@ -3,4 +3,5 @@ vllm
grpcio==1.63.0
protobuf
certifi
transformers==4.38.2
transformers
setuptools

View File

@ -1,10 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs the tests
## A bash script wrapper that runs Python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
python -m unittest $MY_DIR/test_backend_vllm.py
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

View File

@ -6,6 +6,7 @@ import (
"github.com/chasefleming/elem-go"
"github.com/chasefleming/elem-go/attrs"
"github.com/go-skynet/LocalAI/core/services"
"github.com/go-skynet/LocalAI/pkg/gallery"
"github.com/go-skynet/LocalAI/pkg/xsync"
)
@ -72,12 +73,13 @@ func StartProgressBar(uid, progress, text string) string {
if progress == "" {
progress = "0"
}
return elem.Div(attrs.Props{
"hx-trigger": "done",
"hx-get": "/browse/job/" + uid,
"hx-swap": "innerHTML",
"hx-target": "this",
},
return elem.Div(
attrs.Props{
"hx-trigger": "done",
"hx-get": "/browse/job/" + uid,
"hx-swap": "innerHTML",
"hx-target": "this",
},
elem.H3(
attrs.Props{
"role": "status",
@ -223,7 +225,7 @@ func deleteButton(modelName string) elem.Node {
)
}
func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[string, string]) string {
func ListModels(models []*gallery.GalleryModel, processing *xsync.SyncedMap[string, string], galleryService *services.GalleryService) string {
//StartProgressBar(uid, "0")
modelsElements := []elem.Node{}
// span := func(s string) elem.Node {
@ -258,7 +260,15 @@ func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[stri
actionDiv := func(m *gallery.GalleryModel) elem.Node {
galleryID := fmt.Sprintf("%s@%s", m.Gallery.Name, m.Name)
currentlyInstalling := installing.Exists(galleryID)
currentlyProcessing := processing.Exists(galleryID)
isDeletionOp := false
if currentlyProcessing {
status := galleryService.GetStatus(galleryID)
if status != nil && status.Deletion {
isDeletionOp = true
}
// if status == nil : "Waiting"
}
nodes := []elem.Node{
cardSpan("Repository: "+m.Gallery.Name, "fa-brands fa-git-alt"),
@ -292,6 +302,11 @@ func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[stri
)
}
progressMessage := "Installation"
if isDeletionOp {
progressMessage = "Deletion"
}
return elem.Div(
attrs.Props{
"class": "px-6 pt-4 pb-2",
@ -303,9 +318,9 @@ func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[stri
nodes...,
),
elem.If(
currentlyInstalling,
currentlyProcessing,
elem.Node( // If currently installing, show progress bar
elem.Raw(StartProgressBar(installing.Get(galleryID), "0", "Installing")),
elem.Raw(StartProgressBar(processing.Get(galleryID), "0", progressMessage)),
), // Otherwise, show install button (if not installed) or display "Installed"
elem.If(m.Installed,
elem.Node(elem.Div(
@ -331,12 +346,6 @@ func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[stri
"class": "flex justify-center items-center",
}
_, trustRemoteCodeExists := m.Overrides["trust_remote_code"]
if trustRemoteCodeExists {
// should this be checking for trust_remote_code: false? I don't think we ever use that value.
divProperties["class"] = divProperties["class"] + " remote-code"
}
elems = append(elems,
elem.Div(divProperties,
@ -352,6 +361,19 @@ func ListModels(models []*gallery.GalleryModel, installing *xsync.SyncedMap[stri
),
))
_, trustRemoteCodeExists := m.Overrides["trust_remote_code"]
if trustRemoteCodeExists {
elems = append(elems, elem.Div(
attrs.Props{
"class": "flex justify-center items-center bg-red-500 text-white p-2 rounded-lg mt-2",
},
elem.I(attrs.Props{
"class": "fa-solid fa-circle-exclamation pr-2",
}),
elem.Text("Attention: Trust Remote Code is required for this model"),
))
}
elems = append(elems, descriptionDiv(m), actionDiv(m))
modelsElements = append(modelsElements,
elem.Div(

View File

@ -9,7 +9,7 @@ import (
)
func WelcomeEndpoint(appConfig *config.ApplicationConfig,
cl *config.BackendConfigLoader, ml *model.ModelLoader) func(*fiber.Ctx) error {
cl *config.BackendConfigLoader, ml *model.ModelLoader, modelStatus func() (map[string]string, map[string]string)) func(*fiber.Ctx) error {
return func(c *fiber.Ctx) error {
models, _ := ml.ListModels()
backendConfigs := cl.GetAllBackendConfigs()
@ -24,6 +24,9 @@ func WelcomeEndpoint(appConfig *config.ApplicationConfig,
galleryConfigs[m.Name] = cfg
}
// Get model statuses to display in the UI the operation in progress
processingModels, taskTypes := modelStatus()
summary := fiber.Map{
"Title": "LocalAI API - " + internal.PrintableVersion(),
"Version": internal.PrintableVersion(),
@ -31,6 +34,8 @@ func WelcomeEndpoint(appConfig *config.ApplicationConfig,
"ModelsConfig": backendConfigs,
"GalleryConfig": galleryConfigs,
"ApplicationConfig": appConfig,
"ProcessingModels": processingModels,
"TaskTypes": taskTypes,
}
if string(c.Context().Request.Header.ContentType()) == "application/json" || len(c.Accepts("html")) == 0 {

View File

@ -63,10 +63,14 @@ func getBase64Image(s string) (string, error) {
return encoded, nil
}
// if the string instead is prefixed with "data:image/jpeg;base64,", drop it
if strings.HasPrefix(s, "data:image/jpeg;base64,") {
return strings.ReplaceAll(s, "data:image/jpeg;base64,", ""), nil
// if the string instead is prefixed with "data:image/...;base64,", drop it
dropPrefix := []string{"data:image/jpeg;base64,", "data:image/png;base64,"}
for _, prefix := range dropPrefix {
if strings.HasPrefix(s, prefix) {
return strings.ReplaceAll(s, prefix, ""), nil
}
}
return "", fmt.Errorf("not valid string")
}
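The generalized prefix handling above uses `strings.ReplaceAll`, which removes the marker anywhere it occurs in the string; a `strings.TrimPrefix`-based variant (a hypothetical sketch, not what this PR ships) strips it only at the start, which is the only place a data-URL prefix can legitimately appear:

```go
package main

import (
	"fmt"
	"strings"
)

// stripDataURLPrefix drops a known data-URL prefix from a base64 image string.
// It returns the remaining payload and whether any prefix matched.
func stripDataURLPrefix(s string) (string, bool) {
	prefixes := []string{"data:image/jpeg;base64,", "data:image/png;base64,"}
	for _, p := range prefixes {
		if strings.HasPrefix(s, p) {
			// TrimPrefix only removes the marker at the start of the string.
			return strings.TrimPrefix(s, p), true
		}
	}
	return "", false
}

func main() {
	payload, ok := stripDataURLPrefix("data:image/png;base64,iVBORw0KGgo=")
	fmt.Println(ok, payload) // true iVBORw0KGgo=
}
```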
@ -181,7 +185,7 @@ func updateRequestConfig(config *config.BackendConfig, input *schema.OpenAIReque
input.Messages[i].StringContent = fmt.Sprintf("[img-%d]", index) + input.Messages[i].StringContent
index++
} else {
fmt.Print("Failed encoding image", err)
log.Error().Msgf("Failed encoding image: %s", err)
}
}
}

View File

@ -26,13 +26,35 @@ func RegisterUIRoutes(app *fiber.App,
galleryService *services.GalleryService,
auth func(*fiber.Ctx) error) {
app.Get("/", auth, localai.WelcomeEndpoint(appConfig, cl, ml))
// keeps the state of models that are being installed from the UI
var installingModels = xsync.NewSyncedMap[string, string]()
var processingModels = xsync.NewSyncedMap[string, string]()
// modelStatus returns the current status of the models being processed (installation or deletion)
// it is called asynchronously from the UI
modelStatus := func() (map[string]string, map[string]string) {
processingModelsData := processingModels.Map()
taskTypes := map[string]string{}
for k, v := range processingModelsData {
status := galleryService.GetStatus(v)
taskTypes[k] = "Installation"
if status != nil && status.Deletion {
taskTypes[k] = "Deletion"
} else if status == nil {
taskTypes[k] = "Waiting"
}
}
return processingModelsData, taskTypes
}
app.Get("/", auth, localai.WelcomeEndpoint(appConfig, cl, ml, modelStatus))
// Show the Models page (all models)
app.Get("/browse", auth, func(c *fiber.Ctx) error {
term := c.Query("term")
models, _ := gallery.AvailableGalleryModels(appConfig.Galleries, appConfig.ModelPath)
// Get all available tags
@ -47,12 +69,22 @@ func RegisterUIRoutes(app *fiber.App,
tags = append(tags, t)
}
sort.Strings(tags)
if term != "" {
models = gallery.GalleryModels(models).Search(term)
}
// Get model statuses
processingModelsData, taskTypes := modelStatus()
summary := fiber.Map{
"Title": "LocalAI - Models",
"Version": internal.PrintableVersion(),
"Models": template.HTML(elements.ListModels(models, installingModels)),
"Repositories": appConfig.Galleries,
"AllTags": tags,
"Title": "LocalAI - Models",
"Version": internal.PrintableVersion(),
"Models": template.HTML(elements.ListModels(models, processingModels, galleryService)),
"Repositories": appConfig.Galleries,
"AllTags": tags,
"ProcessingModels": processingModelsData,
"TaskTypes": taskTypes,
// "ApplicationConfig": appConfig,
}
@ -72,17 +104,7 @@ func RegisterUIRoutes(app *fiber.App,
models, _ := gallery.AvailableGalleryModels(appConfig.Galleries, appConfig.ModelPath)
filteredModels := []*gallery.GalleryModel{}
for _, m := range models {
if strings.Contains(m.Name, form.Search) ||
strings.Contains(m.Description, form.Search) ||
strings.Contains(m.Gallery.Name, form.Search) ||
strings.Contains(strings.Join(m.Tags, ","), form.Search) {
filteredModels = append(filteredModels, m)
}
}
return c.SendString(elements.ListModels(filteredModels, installingModels))
return c.SendString(elements.ListModels(gallery.GalleryModels(models).Search(form.Search), processingModels, galleryService))
})
/*
@ -103,7 +125,7 @@ func RegisterUIRoutes(app *fiber.App,
uid := id.String()
installingModels.Set(galleryID, uid)
processingModels.Set(galleryID, uid)
op := gallery.GalleryOp{
Id: uid,
@ -129,7 +151,7 @@ func RegisterUIRoutes(app *fiber.App,
uid := id.String()
installingModels.Set(galleryID, uid)
processingModels.Set(galleryID, uid)
op := gallery.GalleryOp{
Id: uid,
@ -174,10 +196,10 @@ func RegisterUIRoutes(app *fiber.App,
status := galleryService.GetStatus(c.Params("uid"))
galleryID := ""
for _, k := range installingModels.Keys() {
if installingModels.Get(k) == c.Params("uid") {
for _, k := range processingModels.Keys() {
if processingModels.Get(k) == c.Params("uid") {
galleryID = k
installingModels.Delete(k)
processingModels.Delete(k)
}
}

View File

@ -26,25 +26,48 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
*/
function submitKey(event) {
event.preventDefault();
localStorage.setItem("key", document.getElementById("apiKey").value);
document.getElementById("apiKey").blur();
}
}
function submitSystemPrompt(event) {
event.preventDefault();
localStorage.setItem("system_prompt", document.getElementById("systemPrompt").value);
document.getElementById("systemPrompt").blur();
}
var image = "";
function submitPrompt(event) {
event.preventDefault();
const input = document.getElementById("input").value;
Alpine.store("chat").add("user", input);
Alpine.store("chat").add("user", input, image);
document.getElementById("input").value = "";
const key = localStorage.getItem("key");
const systemPrompt = localStorage.getItem("system_prompt");
promptGPT(key, input);
promptGPT(systemPrompt, key, input);
}
function readInputImage() {
if (!this.files || !this.files[0]) return;
const FR = new FileReader();
FR.addEventListener("load", function(evt) {
image = evt.target.result;
});
FR.readAsDataURL(this.files[0]);
}
async function promptGPT(key, input) {
async function promptGPT(systemPrompt, key, input) {
const model = document.getElementById("chat-model").value;
// Set class "loader" to the element with "loader" id
//document.getElementById("loader").classList.add("loader");
@ -53,6 +76,72 @@ function submitPrompt(event) {
document.getElementById("input").disabled = true;
document.getElementById('messages').scrollIntoView(false)
messages = Alpine.store("chat").messages();
// if systemPrompt isn't empty, push it at the start of messages
if (systemPrompt) {
messages.unshift({
role: "system",
content: systemPrompt
});
}
// loop all messages, and check if there are images. If there are, we need to change the content field
messages.forEach((message) => {
if (message.image) {
// The content field now becomes an array
message.content = [
{
"type": "text",
"text": message.content
}
]
message.content.push(
{
"type": "image_url",
"image_url": {
"url": message.image,
}
}
);
// remove the image field
delete message.image;
}
});
// reset the form and the image
image = "";
document.getElementById("input_image").value = null;
document.getElementById("fileName").innerHTML = "";
// if (image) {
// // take the last element content's and add the image
// last_message = messages[messages.length - 1]
// // The content field now becomes an array
// last_message.content = [
// {
// "type": "text",
// "text": last_message.content
// }
// ]
// last_message.content.push(
// {
// "type": "image_url",
// "image_url": {
// "url": image,
// }
// }
// );
// // and we replace it in the messages array
// messages[messages.length - 1] = last_message
// // reset the form and the image
// image = "";
// document.getElementById("input_image").value = null;
// document.getElementById("fileName").innerHTML = "";
// }
// Source: https://stackoverflow.com/a/75751803/11386095
const response = await fetch("/v1/chat/completions", {
method: "POST",
@ -62,7 +151,7 @@ function submitPrompt(event) {
},
body: JSON.stringify({
model: model,
messages: Alpine.store("chat").messages(),
messages: messages,
stream: true,
}),
});
@ -122,12 +211,24 @@ function submitPrompt(event) {
}
document.getElementById("key").addEventListener("submit", submitKey);
document.getElementById("system_prompt").addEventListener("submit", submitSystemPrompt);
document.getElementById("prompt").addEventListener("submit", submitPrompt);
document.getElementById("input").focus();
document.getElementById("input_image").addEventListener("change", readInputImage);
const storeKey = localStorage.getItem("key");
storeKey = localStorage.getItem("key");
if (storeKey) {
document.getElementById("apiKey").value = storeKey;
} else {
document.getElementById("apiKey").value = null;
}
storesystemPrompt = localStorage.getItem("system_prompt");
if (storesystemPrompt) {
document.getElementById("systemPrompt").value = storesystemPrompt;
} else {
document.getElementById("systemPrompt").value = null;
}
marked.setOptions({

View File

@ -72,16 +72,6 @@ body {
margin: 0.5rem;
}
.remote-code { /* Attempt to make this stand out */
outline-style: solid;
outline-color: red;
outline-width: 0.33rem;
}
.remote-code::after {
content: "\0026A0 Trust Remote Code Required \0026A0"
}
ul {
list-style-type: disc; /* Adds bullet points */
padding-left: 1.25rem; /* Indents the list from the left margin */

View File

@ -62,17 +62,34 @@ SOFTWARE.
<button @click="component = 'key'" title="Update API key"
class="m-2 float-right inline-block rounded bg-primary px-6 pb-2.5 mb-3 pt-2.5 text-xs font-medium uppercase leading-normal text-white shadow-primary-3 transition duration-150 ease-in-out hover:bg-primary-accent-300 hover:shadow-primary-2 focus:bg-primary-accent-300 focus:shadow-primary-2 focus:outline-none focus:ring-0 active:bg-primary-600 active:shadow-primary-2 dark:shadow-black/30 dark:hover:shadow-dark-strong dark:focus:shadow-dark-strong dark:active:shadow-dark-strong"
>Set API Key🔑</button>
<button @click="component = 'system_prompt'" title="System Prompt"
class="m-2 float-right inline-block rounded bg-primary px-6 pb-2.5 mb-3 pt-2.5 text-xs font-medium uppercase leading-normal text-white shadow-primary-3 transition duration-150 ease-in-out hover:bg-primary-accent-300 hover:shadow-primary-2 focus:bg-primary-accent-300 focus:shadow-primary-2 focus:outline-none focus:ring-0 active:bg-primary-600 active:shadow-primary-2 dark:shadow-black/30 dark:hover:shadow-dark-strong dark:focus:shadow-dark-strong dark:active:shadow-dark-strong"
>Set system prompt</button>
</div>
<form x-show="component === 'key'" id="key">
<input
type="password"
id="apiKey"
name="apiKey"
class="bg-gray-800 text-white border border-gray-600 focus:border-blue-500 focus:ring focus:ring-blue-500 focus:ring-opacity-50 rounded-md shadow-sm p-2 appearance-none"
placeholder="OpenAI API Key"
x-model.lazy="key"
/>
<button @click="component = 'menu'" type="submit" title="Save API key">
🔒
<i class="fa-solid fa-arrow-right"></i>
</button>
</form>
<form x-show="component === 'system_prompt'" id="system_prompt">
<textarea
type="text"
id="systemPrompt"
name="systemPrompt"
class="bg-gray-800 text-white border border-gray-600 focus:border-blue-500 focus:ring focus:ring-blue-500 focus:ring-opacity-50 rounded-md shadow-sm p-2 appearance-none"
placeholder="System prompt"
x-model.lazy="system_prompt"
></textarea>
<button @click="component = 'menu'" type="submit" title="Save Prompt">
<i class="fa-solid fa-arrow-right"></i>
</button>
</form>
@ -111,15 +128,19 @@ SOFTWARE.
<template x-if="message.role === 'assistant'">
<div class="p-2 flex-1 rounded" :class="message.role" x-html="message.html"></div>
</template>
<template x-if="message.image">
<img :src="message.image" alt="Image" class="rounded-lg mt-2 h-36 w-36">
</template>
</div>
</div>
</template>
</div>
</div>
<div class="p-4 border-t border-gray-700" x-data="{ inputValue: '', shiftPressed: false }">
<div class="p-4 border-t border-gray-700" x-data="{ inputValue: '', shiftPressed: false, fileName: '' }">
<div id="loader" class="my-2 loader" style="display: none;"></div>
<input id="chat-model" type="hidden" value="{{.Model}}">
<input id="input_image" type="file" style="display: none;" @change="fileName = $event.target.files[0].name">
<form id="prompt" action="/chat/{{.Model}}" method="get" @submit.prevent="submitPrompt">
<div class="relative w-full">
<textarea
@ -134,7 +155,10 @@ SOFTWARE.
@keydown.enter="if (!shiftPressed) { submitPrompt($event); }"
style="padding-right: 4rem;"
></textarea>
<button type=submit><i class="fa-solid fa-circle-up text-gray-300 absolute right-2 top-3 text-lg p-2 ml-2"></i></button>
<span x-text="fileName" id="fileName" class="absolute right-16 top-5 text-gray-300 text-sm mr-2"></span>
<button type="button" onclick="document.getElementById('input_image').click()" class="fa-solid fa-paperclip text-gray-300 ml-2 absolute right-10 top-3 text-lg p-2">
</button>
<button type=submit><i class="fa-solid fa-circle-up text-gray-300 absolute right-2 top-3 text-lg p-2"></i></button>
</div>
</form>
</div>
@ -146,7 +170,7 @@ SOFTWARE.
clear() {
this.history.length = 0;
},
add(role, content) {
add(role, content, image) {
const N = this.history.length - 1;
if (this.history.length && this.history[N].role === role) {
this.history[N].content += content;
@ -167,6 +191,7 @@ SOFTWARE.
role: role,
content: content,
html: c,
image: image,
});
}
@ -191,6 +216,7 @@ SOFTWARE.
return {
role: message.role,
content: message.content,
image: message.image,
};
});
},

View File

@ -10,38 +10,76 @@
<div class="container mx-auto px-4 flex-grow">
<div class="header text-center py-12">
<h1 class="text-5xl font-bold text-gray-100">Welcome to <i>your</i> LocalAI instance!</h1>
<div class="mt-6">
<!-- Logo can be uncommented and updated with a valid URL -->
</div>
<p class="mt-4 text-lg">The FOSS alternative to OpenAI, Claude, ...</p>
<a href="https://localai.io" target="_blank" class="mt-4 inline-block bg-blue-500 text-white py-2 px-4 rounded-lg shadow transition duration-300 ease-in-out hover:bg-blue-700 hover:shadow-lg">
<i class="fas fa-book-reader pr-2"></i>Documentation
</a>
</div>
<div class="models mt-12">
<div class="models mt-4">
<!-- Show in progress operations-->
{{ if .ProcessingModels }}
<h3
class="mt-4 mb-4 text-center text-3xl font-semibold text-gray-100">Operations in progress</h3>
{{end}}
{{$taskType:=.TaskTypes}}
{{ range $key,$value:=.ProcessingModels }}
{{ $op := index $taskType $key}}
{{$parts := split "@" $key}}
<div class="flex items-center justify-between bg-slate-600 p-2 mb-2 rounded-md">
<div class="flex items-center">
<span class="text-gray-300"><a href="/browse?term={{$parts._1}}"
class="text-white-500 inline-block bg-blue-200 rounded-full px-3 py-1 text-sm font-semibold text-gray-700 mr-2 mb-2 hover:bg-gray-300 hover:shadow-gray-2"
>{{$parts._1}}</a> (from the '{{$parts._0}}' repository)</span>
</div>
<div hx-get="/browse/job/{{$value}}" hx-swap="innerHTML" hx-target="this" hx-trigger="done">
<h3 role="status" id="pblabel" >{{$op}}
<div hx-get="/browse/job/progress/{{$value}}" hx-trigger="every 600ms" hx-target="this"
hx-swap= "innerHTML" ></div></h3>
</div>
</div>
{{ end }}
<!-- END Show in progress operations-->
{{ if eq (len .ModelsConfig) 0 }}
<h2 class="text-center text-3xl font-semibold text-gray-100"> <i class="text-yellow-200 ml-2 fa-solid fa-triangle-exclamation animate-pulse"></i> Ouch! seems you don't have any models installed!</h2>
<p class="text-center mt-4 text-xl">..install something from the <a class="text-gray-400 hover:text-white ml-1 px-3 py-2 rounded" href="/browse">🖼️ Gallery</a> or check the <a href="https://localai.io/basics/getting_started/" class="text-gray-400 hover:text-white ml-1 px-3 py-2 rounded"> <i class="fa-solid fa-book"></i> Getting started documentation </a></p>
{{ else }}
<h2 class="text-center text-3xl font-semibold text-gray-100">Installed models</h2>
<p class="text-center mt-4 text-xl">We have {{len .ModelsConfig}} pre-loaded models available.</p>
<ul class="mt-8 space-y-4">
<table class="table-auto mt-4 w-full text-left text-gray-200">
<thead class="text-xs text-gray-400 uppercase bg-gray-700">
<tr>
<th class="px-4 py-2"></th>
<th class="px-4 py-2">Model Name</th>
<th class="px-4 py-2">Backend</th>
<th class="px-4 py-2 float-right">Actions</th>
</tr>
</thead>
<tbody>
{{$galleryConfig:=.GalleryConfig}}
{{$noicon:="https://upload.wikimedia.org/wikipedia/commons/6/65/No-Image-Placeholder.svg"}}
{{ range .ModelsConfig }}
{{ $cfg:= index $galleryConfig .Name}}
<li class="bg-gray-800 border border-gray-700 p-4 rounded-lg">
<div class="flex justify-between items-center">
<tr class="bg-gray-800 border-b border-gray-700">
<td class="px-4 py-3">
{{ with $cfg }}
<img {{ if $cfg.Icon }}
src="{{$cfg.Icon}}"
{{ else }}
src="https://upload.wikimedia.org/wikipedia/commons/6/65/No-Image-Placeholder.svg"
src="{{$noicon}}"
{{ end }}
class="rounded-t-lg max-h-24 max-w-24 object-cover mt-3"
>
<p class="font-bold text-white flex items-center"><i class="fas fa-brain pr-2"></i>{{.Name}}</p>
{{ else}}
<img src="{{$noicon}}" class="rounded-t-lg max-h-24 max-w-24 object-cover mt-3">
{{ end }}
</td>
<td class="px-4 py-3 font-bold">
<p class="font-bold text-white flex items-center"><i class="fas fa-brain pr-2"></i><a href="/browse?term={{.Name}}">{{.Name}}</a></p>
</td>
<td class="px-4 py-3 font-bold">
{{ if .Backend }}
<!-- Badge for Backend -->
<span class="inline-block bg-blue-500 text-white py-1 px-3 rounded-full text-xs">
@ -52,16 +90,20 @@
auto
</span>
{{ end }}
</td>
<td class="px-4 py-3">
<button
class="float-right inline-block rounded bg-red-800 px-6 pb-2.5 mb-3 pt-2.5 text-xs font-medium uppercase leading-normal text-white shadow-primary-3 transition duration-150 ease-in-out hover:bg-red-accent-300 hover:shadow-red-2 focus:bg-red-accent-300 focus:shadow-primary-2 focus:outline-none focus:ring-0 active:bg-red-600 active:shadow-primary-2 dark:shadow-black/30 dark:hover:shadow-dark-strong dark:focus:shadow-dark-strong dark:active:shadow-dark-strong"
data-twe-ripple-color="light" data-twe-ripple-init="" hx-confirm="Are you sure you wish to delete the model?" hx-post="/browse/delete/model/{{.Name}}" hx-swap="outerHTML"><i class="fa-solid fa-cancel pr-2"></i>Delete</button>
</div>
<!-- Additional details can go here -->
</li>
</td>
{{ end }}
</ul>
</tbody>
</table>
{{ end }}
</div>
</div>

View File

@ -63,8 +63,33 @@
{{ end }}
</div>
<span class="htmx-indicator loader"></span>
<input class="form-control appearance-none block w-full px-3 py-2 text-base font-normal text-gray-300 pb-2 mb-5 bg-gray-800 bg-clip-padding border border-solid border-gray-600 rounded transition ease-in-out m-0 focus:text-gray-300 focus:bg-gray-900 focus:border-blue-500 focus:outline-none" type="search"
<!-- Show in progress operations-->
{{ if .ProcessingModels }}
<h2
class="mt-4 mb-4 text-center text-3xl font-semibold text-gray-100">Operations in progress</h2>
{{end}}
{{$taskType:=.TaskTypes}}
{{ range $key,$value:=.ProcessingModels }}
{{ $op := index $taskType $key}}
{{$parts := split "@" $key}}
<div class="flex items-center justify-between bg-slate-600 p-2 mb-2 rounded-md">
<div class="flex items-center">
<span class="text-gray-300"><a href="/browse?term={{$parts._1}}"
class="text-white-500 inline-block bg-blue-200 rounded-full px-3 py-1 text-sm font-semibold text-gray-700 mr-2 mb-2 hover:bg-gray-300 hover:shadow-gray-2"
>{{$parts._1}}</a> (from the '{{$parts._0}}' repository)</span>
</div>
<div hx-get="/browse/job/{{$value}}" hx-swap="innerHTML" hx-target="this" hx-trigger="done">
<h3 role="status" id="pblabel" >{{$op}}
<div hx-get="/browse/job/progress/{{$value}}" hx-trigger="every 600ms" hx-target="this"
hx-swap= "innerHTML" ></div></h3>
</div>
</div>
{{ end }}
<!-- END Show in progress operations-->
<input class="form-control appearance-none block w-full mt-5 px-3 py-2 text-base font-normal text-gray-300 pb-2 mb-5 bg-gray-800 bg-clip-padding border border-solid border-gray-600 rounded transition ease-in-out m-0 focus:text-gray-300 focus:bg-gray-900 focus:border-blue-500 focus:outline-none" type="search"
name="search" placeholder="Begin Typing To Search models..."
hx-post="/browse/search/models"
hx-trigger="input changed delay:500ms, search"

View File

@ -258,6 +258,22 @@
- filename: Llama-3-LewdPlay-8B-evo.q8_0.gguf
sha256: 1498152d598ff441f73ec6af9d3535875302e7251042d87feb7e71a3618966e8
uri: huggingface://Undi95/Llama-3-LewdPlay-8B-evo-GGUF/Llama-3-LewdPlay-8B-evo.q8_0.gguf
- <<: *llama3
name: "llama-3-soliloquy-8b-v2-iq-imatrix"
license: cc-by-nc-4.0
icon: https://cdn-uploads.huggingface.co/production/uploads/65d4cf2693a0a3744a27536c/u98dnnRVCwMh6YYGFIyff.png
urls:
- https://huggingface.co/Lewdiculous/Llama-3-Soliloquy-8B-v2-GGUF-IQ-Imatrix
description: |
Soliloquy-L3 is a highly capable roleplaying model designed for immersive, dynamic experiences. Trained on over 250 million tokens of roleplaying data, Soliloquy-L3 has a vast knowledge base, rich literary expression, and support for up to 24k context length. It outperforms existing ~13B models, delivering enhanced roleplaying capabilities.
overrides:
context_size: 8192
parameters:
model: Llama-3-Soliloquy-8B-v2-Q4_K_M-imat.gguf
files:
- filename: Llama-3-Soliloquy-8B-v2-Q4_K_M-imat.gguf
sha256: 3e4e066e57875c36fc3e1c1b0dba506defa5b6ed3e3e80e1f77c08773ba14dc8
uri: huggingface://Lewdiculous/Llama-3-Soliloquy-8B-v2-GGUF-IQ-Imatrix/Llama-3-Soliloquy-8B-v2-Q4_K_M-imat.gguf
- <<: *llama3
name: "chaos-rp_l3_b-iq-imatrix"
urls:


@ -1,6 +1,9 @@
package gallery
import (
"fmt"
"strings"
)
// GalleryModel is the struct used to represent a model in the gallery returned by the endpoint.
// It is used to install the model by resolving the URL and downloading the files.
@ -28,3 +31,19 @@ type GalleryModel struct {
func (m GalleryModel) ID() string {
return fmt.Sprintf("%s@%s", m.Gallery.Name, m.Name)
}
type GalleryModels []*GalleryModel
func (gm GalleryModels) Search(term string) GalleryModels {
var filteredModels GalleryModels
for _, m := range gm {
if strings.Contains(m.Name, term) ||
strings.Contains(m.Description, term) ||
strings.Contains(m.Gallery.Name, term) ||
strings.Contains(strings.Join(m.Tags, ","), term) {
filteredModels = append(filteredModels, m)
}
}
return filteredModels
}
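The new `Search` method does a simple case-sensitive substring match across the model name, description, gallery name, and the comma-joined tags. A self-contained sketch of its behavior, using minimal stand-in types that carry only the fields `Search` touches:

```go
package main

import (
	"fmt"
	"strings"
)

// Stand-in types with just the fields used by Search.
type Gallery struct{ Name string }

type GalleryModel struct {
	Name        string
	Description string
	Tags        []string
	Gallery     Gallery
}

type GalleryModels []*GalleryModel

// Search keeps every model whose name, description, gallery name,
// or joined tags contain the term (case-sensitive substring match).
func (gm GalleryModels) Search(term string) GalleryModels {
	var filtered GalleryModels
	for _, m := range gm {
		if strings.Contains(m.Name, term) ||
			strings.Contains(m.Description, term) ||
			strings.Contains(m.Gallery.Name, term) ||
			strings.Contains(strings.Join(m.Tags, ","), term) {
			filtered = append(filtered, m)
		}
	}
	return filtered
}

func main() {
	models := GalleryModels{
		{Name: "llama-3-soliloquy-8b-v2-iq-imatrix", Gallery: Gallery{Name: "localai"}},
		{Name: "chaos-rp_l3_b-iq-imatrix", Tags: []string{"roleplay"}, Gallery: Gallery{Name: "localai"}},
	}
	for _, m := range models.Search("soliloquy") {
		fmt.Println(m.Name) // prints: llama-3-soliloquy-8b-v2-iq-imatrix
	}
}
```

Because the match is case-sensitive, searching for "Soliloquy" would not match the lowercase model name; only the exact-case substring does.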


@ -15,6 +15,12 @@ func NewSyncedMap[K comparable, V any]() *SyncedMap[K, V] {
}
}
func (m *SyncedMap[K, V]) Map() map[K]V {
m.mu.RLock()
defer m.mu.RUnlock()
return m.m
}
func (m *SyncedMap[K, V]) Get(key K) V {
m.mu.RLock()
defer m.mu.RUnlock()
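The `SyncedMap` methods above guard reads with an `RWMutex`. A minimal self-contained sketch of the pattern (the `Set` method here is an assumption added for illustration; it is not shown in this diff):

```go
package main

import (
	"fmt"
	"sync"
)

// SyncedMap sketch mirroring the generic map shown in the diff.
type SyncedMap[K comparable, V any] struct {
	mu sync.RWMutex
	m  map[K]V
}

func NewSyncedMap[K comparable, V any]() *SyncedMap[K, V] {
	return &SyncedMap[K, V]{m: make(map[K]V)}
}

// Set is assumed for illustration: writers take the exclusive lock.
func (s *SyncedMap[K, V]) Set(key K, value V) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.m[key] = value
}

// Map returns the underlying map; readers share the read lock.
func (s *SyncedMap[K, V]) Map() map[K]V {
	s.mu.RLock()
	defer s.mu.RUnlock()
	return s.m
}

func main() {
	sm := NewSyncedMap[string, int]()
	sm.Set("jobs", 3)
	fmt.Println(sm.Map()["jobs"]) // 3
}
```

Note that `Map()` returns the internal map itself rather than a copy, so the read lock only protects the retrieval; callers iterating the result while writers mutate it concurrently would still race.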