LocalAI/backend/python/common-env/transformers
Latest commit: fb0a4c5d9a by fenfir
Build docker container for ROCm (#1595)
* Dockerfile changes to build for ROCm

* Adjust linker flags for ROCm

* Update conda env for diffusers and transformers to use ROCm pytorch

* Update transformers conda env for ROCm

* ci: build hipblas images

* fixup rebase

* use self-hosted

Signed-off-by: mudler <mudler@localai.io>

* specify LD_LIBRARY_PATH only when BUILD_TYPE=hipblas

---------

Signed-off-by: mudler <mudler@localai.io>
Co-authored-by: mudler <mudler@localai.io>
Committed: 2024-02-16 15:08:50 +01:00
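
The commit notes above mention a CI job that builds hipblas images on a self-hosted runner, driven by a BUILD_TYPE=hipblas build setting. Below is a minimal sketch of what such a job could look like, assuming a GitHub Actions workflow; the workflow name, image tag, and build-argument wiring are illustrative assumptions, not LocalAI's actual CI configuration.

```yaml
# Hypothetical sketch of a hipblas image build on a self-hosted runner.
name: build-hipblas-image
on: [push]
jobs:
  hipblas:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - name: Build the ROCm container image
        run: |
          # BUILD_TYPE=hipblas is the ROCm build flavour referenced in the commit;
          # how the Dockerfile consumes it is an assumption here.
          docker build \
            --build-arg BUILD_TYPE=hipblas \
            -t local-ai:hipblas .
```

Per the last commit note, LD_LIBRARY_PATH is only set when BUILD_TYPE=hipblas, so non-hipblas builds are left untouched.
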
File                      Last commit (date)
install.sh                feat: more embedded models, coqui fixes, add model usage and description (#1556), 2024-01-08 00:37:02 +01:00
Makefile                  Build docker container for ROCm (#1595), 2024-02-16 15:08:50 +01:00
transformers-nvidia.yml   deps(conda): use transformers environment with autogptq (#1555), 2024-01-06 15:30:53 +01:00
transformers-rocm.yml     Build docker container for ROCm (#1595), 2024-02-16 15:08:50 +01:00
transformers.yml          deps(conda): use transformers environment with autogptq (#1555), 2024-01-06 15:30:53 +01:00
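
transformers-rocm.yml is the ROCm counterpart of the NVIDIA and CPU environments listed above; per the commit history, it was updated to use ROCm builds of PyTorch. As a rough sketch only, a ROCm-flavoured conda environment typically points pip at a ROCm wheel index instead of the default CUDA one. The index URL and package list below are assumptions for illustration; the authoritative contents live in transformers-rocm.yml itself.

```yaml
# Illustrative sketch of a ROCm conda environment, not the real transformers-rocm.yml.
name: transformers
channels:
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
      # Assumption: pull PyTorch wheels built against ROCm from the ROCm wheel index
      - --extra-index-url https://download.pytorch.org/whl/rocm5.6
      - torch
      - transformers
      - accelerate
```

Keeping the CUDA and ROCm environments in separate files presumably lets the container build install whichever one matches BUILD_TYPE, rather than branching inside a single environment definition.
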