# LocalAI/backend/python
Contents of this directory and the last commit touching each entry:

| Entry | Last commit | Date |
|---|---|---|
| autogptq | feat(autogpt/transformers): consume trust_remote_code (#1799) | 2024-03-05 |
| bark | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 |
| common-env/transformers | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| coqui | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 |
| diffusers | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| exllama | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| exllama2 | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| mamba | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| petals | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| sentencetransformers | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 |
| transformers | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| transformers-musicgen | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 |
| vall-e-x | feat(intel): add diffusers/transformers support (#1746) | 2024-03-07 |
| vllm | Bump vLLM version + more options when loading models in vLLM (#1782) | 2024-03-01 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 |

## Common commands for conda environments

### Create a new empty conda environment

```bash
conda create --name <env-name> python=<your-version> -y
```

For example, to create an environment named `autogptq` with Python 3.11:

```bash
conda create --name autogptq python=3.11 -y
```
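To confirm the environment was created, you can list all available conda environments:

```bash
conda env list
```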

### Activate the environment

With conda 4.4 and later:

```bash
conda activate autogptq
```

With conda versions older than 4.4:

```bash
source activate autogptq
```
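When you are finished working in the environment, you can deactivate it again (conda 4.4 and later):

```bash
conda deactivate
```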

### Install packages into your environment

Sometimes you need to install packages from the conda-forge channel.

With conda:

```bash
conda install <your-package-name>

conda install -c conda-forge <your-package-name>
```

Or with pip:

```bash
pip install <your-package-name>
```
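As a minimal sketch of how this applies to the backends listed above, assuming the backend you want to work on ships a `requirements.txt` in its directory (the exact setup varies per backend, so check its contents first), the environment could be populated like this:

```bash
# create and activate a dedicated environment (names here are illustrative)
conda create --name diffusers python=3.11 -y
conda activate diffusers

# install the backend's Python dependencies,
# assuming a requirements.txt is present in its directory
pip install -r backend/python/diffusers/requirements.txt
```

Whether a given backend also needs hardware-specific packages (CUDA, Intel oneAPI, and so on) depends on the backend, so check its directory for any additional setup scripts before relying on pip alone.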