AUTOMATIC1111
6f0abbb71a
textual inversion support for SDXL
2023-07-29 15:15:06 +03:00
AUTOMATIC1111
5677296d1b
Merge pull request #11878 from Bourne-M/patch-1
[bug] fix error when reloading AltCLIP model
2023-07-19 16:26:12 +03:00
yfzhou
cb75734896
[bug] fix error when reloading AltCLIP model
When BertSeriesModelWithTransformation is used as the cond_stage_model, undo_hijack must check for the FrozenXLMREmbedderWithCustomWords wrapper type; otherwise the model fails to reload.
2023-07-19 17:53:28 +08:00
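The fix, roughly: undo_hijack has to match the wrapper class actually installed during the hijack, not the inner model. A minimal sketch, assuming the webui's usual wrap/unwrap pattern (the class names come from the commit message; the surrounding structure is illustrative):

```python
def undo_hijack(model):
    # match on the wrapper actually installed during hijack
    wrapper = type(model.cond_stage_model).__name__
    if wrapper == "FrozenXLMREmbedderWithCustomWords":
        # AltCLIP's BertSeriesModelWithTransformation sits inside this
        # wrapper, so the check must target the wrapper type
        model.cond_stage_model = model.cond_stage_model.wrapped
    elif wrapper == "FrozenCLIPEmbedderWithCustomWords":
        model.cond_stage_model = model.cond_stage_model.wrapped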
AUTOMATIC1111
0198eaec45
Merge pull request #11757 from AUTOMATIC1111/sdxl
SD XL support
2023-07-16 12:04:53 +03:00
AUTOMATIC1111
2b1bae0d75
add textual inversion hashes to infotext
2023-07-15 08:41:22 +03:00
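These hashes appear to follow the webui's shorthash convention (the first 10 hex characters of a file's sha256); a sketch of computing one, with the helper name assumed:

```python
import hashlib

def embedding_shorthash(path):
    # stream the embedding file through sha256 and keep a short prefix
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            sha.update(block)
    return sha.hexdigest()[:10]

# infotext then gains an entry like: TI hashes: "myembedding: 1a2b3c4d5e"
```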
AUTOMATIC1111
6d8dcdefa0
initial SDXL refiner support
2023-07-14 09:16:01 +03:00
AUTOMATIC1111
594c8e7b26
fix CLIP doing unneeded normalization
revert SD2.1 back to use the original repo
add SDXL's force_zero_embeddings to negative prompt
2023-07-13 11:35:52 +03:00
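A hedged sketch of what force_zero_embeddings means for an empty negative prompt, following the sgm repo's behaviour; the function and parameter names here are illustrative:

```python
import torch

def negative_conditioning(encode, negative_prompt, cond_shape):
    # with force_zero_embeddings, an empty negative prompt maps to an
    # all-zero conditioning tensor rather than the text encoding of ""
    if negative_prompt.strip() == "":
        return torch.zeros(cond_shape)
    return encode(negative_prompt)
```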
AUTOMATIC1111
da464a3fb3
SDXL support
2023-07-12 23:52:43 +03:00
AUTOMATIC1111
af081211ee
getting SD2.1 to run on SDXL repo
2023-07-11 21:16:43 +03:00
AUTOMATIC
36888092af
revert default cross attention optimization to Doggettx
make --disable-opt-split-attention command line option work again
2023-06-01 08:12:06 +03:00
AUTOMATIC
339b531570
custom unet support
2023-05-27 15:47:33 +03:00
AUTOMATIC
a6e653be26
possible fix for empty list of optimizations #10605
2023-05-23 18:49:15 +03:00
AUTOMATIC
2140bd1c10
make it actually work after suggestions
2023-05-19 10:05:07 +03:00
AUTOMATIC
8a3d232839
fix linter issues
2023-05-19 00:03:27 +03:00
AUTOMATIC
2582a0fd3b
make it possible for scripts to add cross attention optimizations
add UI selection for cross attention optimization
2023-05-18 22:48:28 +03:00
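A sketch of the extension point this change adds: scripts register an optimization object, and the UI offers every registered name for selection. The class shape follows the webui's pattern but is abbreviated, and the names below are assumptions:

```python
class SdOptimization:
    # base shape for a cross-attention optimization (abbreviated)
    name = "base"
    priority = 0

    def is_available(self):
        return True

    def apply(self):
        # a real implementation replaces CrossAttention.forward here
        raise NotImplementedError

    def undo(self):
        pass

class MyOptimization(SdOptimization):
    name = "my-attention-opt"
    priority = 10

    def apply(self):
        pass  # patch the attention forward functions

def on_list_optimizers(res):
    # a script's callback appends its optimization to the shared list;
    # the UI dropdown is then built from the registered names
    res.append(MyOptimization())
```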
AUTOMATIC
1a43524018
fix model loading twice in some situations
2023-05-14 13:27:50 +03:00
Aarni Koskela
49a55b410b
Autofix Ruff W (not W605) (mostly whitespace)
2023-05-11 20:29:11 +03:00
AUTOMATIC
028d3f6425
ruff auto fixes
2023-05-10 11:05:02 +03:00
AUTOMATIC
f741a98bac
imports cleanup for ruff
2023-05-10 08:43:42 +03:00
AUTOMATIC
762265eab5
autofixes from ruff
2023-05-10 07:52:45 +03:00
Pam
8d7fa2f67c
sdp_attnblock_forward hijack
2023-03-10 22:48:41 +05:00
Pam
0981dea948
sdp refactoring
2023-03-10 12:58:10 +05:00
Pam
37acba2633
add argument to disable memory-efficient attention for SDP
2023-03-10 12:19:36 +05:00
Pam
fec0a89511
scaled dot product attention
2023-03-07 00:33:13 +05:00
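These entries move attention onto PyTorch 2.0's fused scaled-dot-product kernel. A minimal sketch of the idea, including the memory-efficient toggle from the entry above; the hijack's real function names differ:

```python
import torch
import torch.nn.functional as F

def sdp_attention_forward(q, k, v, disable_mem_efficient=False):
    # q, k, v: (batch, heads, tokens, head_dim)
    if disable_mem_efficient and torch.cuda.is_available():
        # torch 2.0's backend selector, matching the "disable memory
        # efficient for sdp" argument above
        with torch.backends.cuda.sdp_kernel(enable_mem_efficient=False):
            return F.scaled_dot_product_attention(q, k, v)
    return F.scaled_dot_product_attention(q, k, v)
```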
AUTOMATIC1111
dfb3b8f398
Merge branch 'master' into weighted-learning
2023-02-19 12:41:29 +03:00
Shondoit
c4bfd20f31
Hijack to add weighted_forward to model: return loss * weight map
2023-02-15 10:03:59 +01:00
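A sketch of the weighted-loss hijack this commit describes, assuming the LatentDiffusion convention that forward returns a loss and get_loss(pred, target, mean) computes it; the names are illustrative:

```python
def weighted_forward(model, x, cond, weight):
    # temporarily swap in a loss scaled elementwise by the weight map
    original_get_loss = model.get_loss

    def weighted_get_loss(pred, target, mean=True):
        loss = original_get_loss(pred, target, mean=False) * weight
        return loss.mean() if mean else loss

    model.get_loss = weighted_get_loss
    try:
        return model(x, cond)  # LatentDiffusion.forward returns the loss
    finally:
        model.get_loss = original_get_loss  # always undo the hijack
```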
brkirch
2016733814
Apply hijacks in ddpm_edit for upcast sampling
To avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded.
2023-02-07 22:53:45 -05:00
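A sketch of the deferred-hijack pattern described above: import and patch ddpm_edit only once an instruct-pix2pix model is actually loaded, so the optional module can't break startup. The module path is ldm's; everything else is assumed:

```python
import importlib

_ddpm_edit_patched = False

def hijack_ddpm_edit(apply_patches):
    """Lazily import ddpm_edit and patch it exactly once, after the
    instruct-pix2pix model is loaded (importing earlier can raise
    import errors before the module's dependencies are set up)."""
    global _ddpm_edit_patched
    if _ddpm_edit_patched:
        return
    ddpm_edit = importlib.import_module("ldm.models.diffusion.ddpm_edit")
    apply_patches(ddpm_edit)
    _ddpm_edit_patched = True
```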
AUTOMATIC1111
fecb990deb
Merge pull request #7309 from brkirch/fix-embeddings
Fix embeddings, upscalers, and refactor `--upcast-sampling`
2023-01-28 18:44:36 +03:00
AUTOMATIC
d04e3e921e
automatically detect v-parameterization for SD2 checkpoints
2023-01-28 15:24:41 +03:00
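A hedged sketch of one such detection heuristic: run the UNet on a constant test latent at a late timestep and threshold the mean of (output − input), which comes out strongly negative for v-prediction models and stays near zero for epsilon-prediction ones. The shapes and threshold below are assumptions:

```python
import torch

@torch.no_grad()
def looks_like_v_prediction(unet, device="cpu"):
    x = torch.ones((1, 4, 8, 8), device=device) * 0.5      # constant test latent
    t = torch.asarray([999], device=device)                 # near-final timestep
    context = torch.ones((1, 2, 1024), device=device) * 0.5 # SD2 context dim
    out = (unet(x, t, context=context) - x).mean().item()
    return out < -1  # threshold is a heuristic
```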
brkirch
ada17dbd7c
Refactor conditional casting, fix upscalers
2023-01-28 04:16:25 -05:00
brkirch
c4b9b07db6
Fix embeddings dtype mismatch
2023-01-26 09:00:15 -05:00
AUTOMATIC
6073456c83
write a comment for fix_checkpoint function
2023-01-19 20:39:10 +03:00
AUTOMATIC
924e222004
add option to show/hide warnings
removed hiding warnings from LDSR
fixed/reworked a few places that produced warnings
2023-01-18 23:04:24 +03:00
AUTOMATIC
085427de0e
make it possible for extensions/scripts to add their own embedding directories
2023-01-08 09:37:33 +03:00
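A sketch of the registry this enables; the class and method names follow the pattern but are assumed rather than exact:

```python
import os

class EmbeddingDatabase:
    def __init__(self):
        self.embedding_dirs = []

    def add_embedding_dir(self, path):
        # extensions/scripts call this to register an extra directory
        self.embedding_dirs.append(path)

    def list_embedding_files(self):
        for d in self.embedding_dirs:
            if not os.path.isdir(d):
                continue
            for root, _, files in os.walk(d, followlinks=True):
                for fn in files:
                    if os.path.splitext(fn)[1].lower() in {".pt", ".bin", ".safetensors"}:
                        yield os.path.join(root, fn)
```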
AUTOMATIC1111
c295e4a244
Merge pull request #6055 from brkirch/sub-quad_attn_opt
Add Birch-san's sub-quadratic attention implementation
2023-01-07 12:26:55 +03:00
AUTOMATIC
79e39fae61
CLIP hijack rework
2023-01-07 01:46:13 +03:00
brkirch
5deb2a19cc
Allow Doggettx's cross attention opt without CUDA
2023-01-06 01:33:15 -05:00
brkirch
3bfe2bb549
Merge remote-tracking branch 'upstream/master' into sub-quad_attn_opt
2023-01-06 00:15:22 -05:00
brkirch
f6ab5a39d7
Merge branch 'AUTOMATIC1111:master' into sub-quad_attn_opt
2023-01-06 00:14:20 -05:00
brkirch
d782a95967
Add Birch-san's sub-quadratic attention implementation
2023-01-06 00:14:13 -05:00
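Birch-san's implementation follows Rabe & Cheng's "Self-attention Does Not Need O(n²) Memory", chunking both queries and keys with a streaming softmax. A simplified sketch that chunks queries only, which already caps the live attention matrix at (chunk × n) instead of (n × n):

```python
import torch

def query_chunked_attention(q, k, v, q_chunk=1024):
    # q, k: (batch, tokens, dim); v: (batch, tokens, v_dim)
    scale = q.shape[-1] ** -0.5
    out = q.new_empty(q.shape[0], q.shape[1], v.shape[-1])
    for i in range(0, q.shape[1], q_chunk):
        qc = q[:, i:i + q_chunk] * scale
        attn = torch.softmax(qc @ k.transpose(-2, -1), dim=-1)
        out[:, i:i + q_chunk] = attn @ v
    return out
```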
Vladimir Mandic
21ee77db31
add cross-attention info
2023-01-04 08:04:38 -05:00
AUTOMATIC
f34c734172
alt-diffusion integration
2022-12-31 18:06:35 +03:00
AUTOMATIC
3f401cdb64
Merge remote-tracking branch 'baai-open-internal/master' into alt-diffusion
2022-12-31 13:02:28 +03:00
AUTOMATIC
505ec7e4d9
cleanup some unneeded imports for hijack files
2022-12-10 09:17:39 +03:00
AUTOMATIC
7dbfd8a7d8
do not replace entire unet for the resolution hack
2022-12-10 09:14:45 +03:00
AUTOMATIC1111
2641d1b83b
Merge pull request #4978 from aliencaocao/support_any_resolution
Patch UNet Forward to support resolutions that are not multiples of 64
2022-12-10 08:45:41 +03:00
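The idea behind the patch, sketched: pad latents up to the required multiple before the UNet's down/up-sampling path and crop afterwards (64 pixels is 8 latent cells at the VAE's 8× downscale). Names are illustrative; the entry above applies the same idea by wrapping only forward rather than swapping out the whole UNet:

```python
import torch.nn.functional as F

def make_padded_forward(original_forward, multiple=8):
    # 64 pixels = 8 latent cells at the VAE's 8x downscale factor
    def forward(x, *args, **kwargs):
        h, w = x.shape[-2:]
        pad_h = (multiple - h % multiple) % multiple
        pad_w = (multiple - w % multiple) % multiple
        if pad_h or pad_w:
            x = F.pad(x, (0, pad_w, 0, pad_h))
        out = original_forward(x, *args, **kwargs)
        return out[..., :h, :w]  # crop back to the requested size
    return forward

# unet.forward = make_padded_forward(unet.forward)
```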
zhaohu xing
5dcc22606d
add hash and fix undo hijack bug
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-12-06 16:04:50 +08:00
Zac Liu
a25dfebeed
Merge pull request #3 from 920232796/master
fix device support for mps
update support for SD2.0
2022-12-06 09:17:57 +08:00
Zac Liu
3ebf977a6e
Merge branch 'AUTOMATIC1111:master' into master
2022-12-06 09:16:15 +08:00
zhaohu xing
4929503258
fix bugs
Signed-off-by: zhaohu xing <920232796@qq.com>
2022-12-06 09:03:55 +08:00