LocalAI/backend/python/mamba
Latest commit: cb7512734d by Ettore Di Giacinto, 2024-01-26 00:13:21 +01:00
transformers: correctly load automodels (#1643)

* backends(transformers): use AutoModel with LLM types
* examples: animagine-xl
* Add codellama examples
backend_mamba.py        feat(extra-backends): Improvements, adding mamba example (#1618)   2024-01-20 17:56:08 +01:00
backend_pb2_grpc.py     feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
backend_pb2.py          transformers: correctly load automodels (#1643)                    2024-01-26 00:13:21 +01:00
install.sh              feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
Makefile                feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
README.md               feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
run.sh                  feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
test_backend_mamba.py   feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00
test.sh                 feat: 🐍 add mamba support (#1589)                                  2024-01-19 23:42:50 +01:00

To create a separate environment for the mamba backend, run:

make mamba
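As a rough sketch of what a target like this typically drives (the actual steps live in Makefile and install.sh, which are not shown here), it creates an isolated Python environment and installs the backend's dependencies. The environment path and package names below are assumptions for illustration, not taken from the real install.sh:

```shell
# Hypothetical sketch only: the real `make mamba` target delegates to
# install.sh, whose contents are not shown in this listing.
python3 -m venv venv              # isolated environment (could also be conda)
. venv/bin/activate
python -m pip install --upgrade pip
# grpcio is needed because the backend exposes a gRPC service
# (backend_pb2.py / backend_pb2_grpc.py are generated gRPC stubs).
pip install grpcio grpcio-tools
```

Once the environment exists, the backend would be started through run.sh and exercised with test.sh / test_backend_mamba.py, per the files listed above.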