C43H66N12O12S2
3e7a981194
remove functorch
2022-10-10 19:54:07 +03:00
Fampai
122d42687b
Fix VRAM issue by only loading the hypernetwork when it is selected in settings
2022-10-09 11:08:11 +03:00
AUTOMATIC
e6e42f98df
make --force-enable-xformers work without needing --xformers
2022-10-08 22:12:23 +03:00
AUTOMATIC
f9c5da1592
add fallback for xformers_attnblock_forward
2022-10-08 19:05:19 +03:00
AUTOMATIC
dc1117233e
simplify xformers options: --xformers to enable and that's it
2022-10-08 17:02:18 +03:00
AUTOMATIC
7ff1170a2e
emergency fix for xformers (continue + shared)
2022-10-08 16:33:39 +03:00
AUTOMATIC1111
48feae37ff
Merge pull request #1851 from C43H66N12O12S2/flash
xformers attention
2022-10-08 16:29:59 +03:00
C43H66N12O12S2
69d0053583
update sd_hijack_opt to respect new env variables
2022-10-08 16:21:40 +03:00
C43H66N12O12S2
76a616fa6b
Update sd_hijack_optimizations.py
2022-10-08 11:55:38 +03:00
C43H66N12O12S2
5d54f35c58
add xformers attnblock and hypernetwork support
2022-10-08 11:55:02 +03:00
brkirch
f2055cb1d4
Add hypernetwork support to split cross attention v1
* Add hypernetwork support to split_cross_attention_forward_v1
* Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
2022-10-08 09:39:17 +03:00
C43H66N12O12S2
c9cc65b201
switch to the proper way of calling xformers
2022-10-08 04:09:18 +03:00
AUTOMATIC
bad7cb29ce
added support for hypernetworks (???)
2022-10-07 10:17:52 +03:00
C43H66N12O12S2
f174fb2922
add xformers attention
2022-10-07 05:21:49 +03:00
Jairo Correa
ad0cc85d1f
Merge branch 'master' into stable
2022-10-02 18:31:19 -03:00
AUTOMATIC
820f1dc96b
initial support for training textual inversion
2022-10-02 15:03:39 +03:00