v0xie | eb667e715a | feat: LyCORIS/kohya OFT network support | 2023-11-15 18:28:48 -08:00
v0xie | d6d0b22e66 | fix: ignore calc_scale() for COFT which has very small alpha | 2023-11-15 03:08:50 -08:00
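A note on why calc_scale() is skipped for COFT: in LoRA-style networks the scale is conventionally alpha / dim, but COFT reuses the alpha slot to store its constraint epsilon, a very small number, so applying alpha / dim would collapse the update toward zero. A sketch of the arithmetic under that assumption (the function name is illustrative, not necessarily the extension's):

```python
def calc_scale(alpha: float, dim: int) -> float:
    # conventional LoRA-style scaling of the learned update
    return alpha / dim

# LoRA-style module: alpha is a genuine scale hyperparameter
print(calc_scale(alpha=8.0, dim=16))   # 0.5, a sensible multiplier

# COFT module: alpha holds the constraint epsilon instead, so the same
# formula would nearly erase the update; hence the fix ignores it
print(calc_scale(alpha=5e-4, dim=16))  # 3.125e-05, effectively zero
```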
v0xie | bbf00a96af | refactor: remove unused function | 2023-11-04 14:56:47 -07:00
v0xie | 329c8bacce | refactor: use same updown for both kohya OFT and LyCORIS diag-oft | 2023-11-04 14:54:36 -07:00
v0xie | f6c8201e56 | refactor: move factorization to lyco_helpers, separate calc_updown for kohya and kb | 2023-11-03 19:35:15 -07:00
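For context on the factorization mentioned above: LyCORIS-style helpers split a layer dimension into two smaller factors so block-structured updates can be built from the pair. A minimal sketch of such a helper, with an illustrative name and signature (not necessarily matching lyco_helpers exactly):

```python
def factorize(dimension: int, factor: int = -1) -> tuple[int, int]:
    """Split `dimension` into (m, n) with m <= n and m * n == dimension.

    If `factor` is positive and divides `dimension`, use it directly;
    otherwise fall back to the most balanced divisor pair.
    """
    if factor > 0 and dimension % factor == 0:
        return min(factor, dimension // factor), max(factor, dimension // factor)
    m = int(dimension ** 0.5)
    while m > 1 and dimension % m != 0:  # largest divisor <= sqrt(dimension)
        m -= 1
    return m, dimension // m

# e.g. factorize(320) == (16, 20); factorize(640) == (20, 32)
```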
v0xie | fe1967a4c4 | skip multihead attn for now | 2023-11-03 17:52:55 -07:00
v0xie | d727ddfccd | wip: trying to support both types of OFT; kblueleaf's diag_oft includes MultiheadAttn modules that kohya's implementation doesn't; attempted a new module based on network_lora.py, currently failing with tensor dimension mismatch errors | 2023-11-02 00:13:11 -07:00
v0xie | 65ccd6305f | detect diag_oft type | 2023-11-02 00:11:32 -07:00
v0xie | a2fad6ee05 | test implementation based on kohaku diag-oft implementation | 2023-11-01 22:34:27 -07:00
v0xie | 6523edb8a4 | style: conform style | 2023-10-22 09:31:15 -07:00
v0xie | 3b8515d2c9 | fix: multiplier applied twice in finalize_updown | 2023-10-22 09:27:48 -07:00
v0xie | 4a50c9638c | refactor: remove unused OFT functions | 2023-10-22 08:54:24 -07:00
v0xie | de8ee92ed8 | fix: use merge_weight to cache value | 2023-10-21 17:37:17 -07:00
v0xie | 76f5abdbdb | style: cleanup oft | 2023-10-21 16:07:45 -07:00
v0xie | fce86ab7d7 | fix: support multiplier, no forward pass hook | 2023-10-21 16:03:54 -07:00
v0xie | 7683547728 | fix: return orig weights during updown, merge weights before forward | 2023-10-21 14:42:24 -07:00
v0xie | 2d8c894b27 | refactor: use forward hook instead of custom forward | 2023-10-21 13:43:31 -07:00
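The hook-vs-custom-forward distinction in the commit above: instead of replacing a module's forward method, a forward hook is registered on the module and can adjust the output while the original forward stays intact. A minimal PyTorch sketch of the pattern (illustrative only, not the extension's actual code):

```python
import torch
import torch.nn as nn

layer = nn.Linear(8, 8)

def add_network_delta(module, inputs, output):
    # Runs after the module's own forward; returning a non-None value
    # replaces the output, so forward never has to be overridden.
    return output  # a real network would add its computed delta here

handle = layer.register_forward_hook(add_network_delta)
y = layer(torch.randn(2, 8))  # original forward runs, then the hook
handle.remove()               # unhooking restores stock behavior
```

The advantage over a custom forward is that removing the handle cleanly restores the module, with no monkey-patched method left behind.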
v0xie | 0550659ce6 | style: fix ambiguous variable name | 2023-10-19 13:13:02 -07:00
v0xie | d10c4db57e | style: formatting | 2023-10-19 12:52:14 -07:00
v0xie | 321680ccd0 | refactor: fix constraint, re-use get_weight | 2023-10-19 12:41:17 -07:00
v0xie | eb01d7f0e0 | faster by calculating R in updown and using cached R in forward | 2023-10-18 04:56:53 -07:00
v0xie | 853e21d98e | faster by using cached R in forward | 2023-10-18 04:27:44 -07:00
v0xie | 1c6efdbba7 | inference working but SLOW | 2023-10-18 04:16:01 -07:00
v0xie | ec718f76b5 | wip incorrect OFT implementation | 2023-10-17 23:35:50 -07:00
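The R being cached in the commits above is the orthogonal transform that OFT applies to a frozen weight (W' = R W); rebuilding it from the trainable parameters on every forward is what made the first working version slow. A minimal sketch of the caching idea using a Cayley transform for orthogonality (class and method names are illustrative, not the extension's actual code):

```python
import torch

def cayley(q: torch.Tensor) -> torch.Tensor:
    """Build an orthogonal R from unconstrained q via the Cayley transform:
    S = q - q^T is skew-symmetric, and R = (I - S)(I + S)^-1 is orthogonal."""
    skew = q - q.T
    eye = torch.eye(q.shape[0], dtype=q.dtype, device=q.device)
    return (eye - skew) @ torch.linalg.inv(eye + skew)

class OFTWeight:
    """Illustrative only: compute R once, reuse it on every forward."""
    def __init__(self, weight: torch.Tensor, q: torch.Tensor):
        self.weight = weight  # frozen base weight, shape (out, in)
        self.q = q            # trainable parameters, shape (out, out)
        self._R = None        # cached rotation, shape (out, out)

    def get_weight(self) -> torch.Tensor:
        if self._R is None:           # expensive inverse happens once
            self._R = cayley(self.q)
        return self._R @ self.weight  # rotated weight W' = R @ W
```

During training the cache would have to be invalidated whenever q changes; for merge-at-load inference, computing R once suffices, which matches the speedup described in the two "faster" commits.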
AUTOMATIC1111 | 4be7b620c2 | Merge pull request #13568 from AUTOMATIC1111/lora_emb_bundle (Add lora-embedding bundle system) | 2023-10-14 12:18:55 +03:00
AUTOMATIC1111 | a8cbe50c9f | remove duplicated code | 2023-10-14 12:17:59 +03:00
v0xie | 906d1179e9 | support inference with LyCORIS GLora networks | 2023-10-11 21:26:58 -07:00
Kohaku-Blueleaf | 891ccb767c | Fix lint | 2023-10-10 15:07:25 +08:00
Kohaku-Blueleaf | 81e94de318 | Add a warning when embedding names conflict; choose the standalone embedding (in the /embeddings folder) first | 2023-10-10 14:44:20 +08:00
Kohaku-Blueleaf | 2282eb8dd5 | Remove dev debug print | 2023-10-10 12:11:00 +08:00
Kohaku-Blueleaf | 3d8b1af6be | Support string_to_param nested dict (format: bundle_emb.EMBNAME.string_to_param.KEYNAME) | 2023-10-10 12:09:33 +08:00
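The bundle key format above nests an embedding name and a parameter key under a common prefix, so one file can carry several embeddings alongside the network weights. A hedged sketch of how such keys could be unpacked, assuming embedding names contain no dots (function name and return shape are illustrative, not the repository's exact implementation):

```python
def unpack_bundle_embeddings(state_dict: dict) -> dict:
    """Group keys like bundle_emb.EMBNAME.string_to_param.KEYNAME into
    {EMBNAME: {"string_to_param": {KEYNAME: tensor}}}."""
    bundles: dict = {}
    for key, tensor in state_dict.items():
        if not key.startswith("bundle_emb."):
            continue  # ordinary network weight, not a bundled embedding
        _, emb_name, field, param_key = key.split(".", 3)
        bundles.setdefault(emb_name, {}).setdefault(field, {})[param_key] = tensor
    return bundles
```

A loader would then register each recovered embedding alongside the network it was bundled with.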
Kohaku-Blueleaf | 2aa485b5af | add lora bundle system | 2023-10-09 22:52:09 +08:00
dongwenpu | 7d4d871d46 | fix: lora-bias-backup doesn't reset cache | 2023-09-10 17:53:42 +08:00
bluelovers | d83a1ba65b | feat: display file metadata ss_output_name (https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/12289) | 2023-08-29 06:33:00 +08:00
AUTOMATIC1111 | 86221269f9 | RAM optimization round 2 | 2023-08-16 09:55:35 +03:00
AUTOMATIC1111 | 85fcb7b8df | lint | 2023-08-15 19:25:03 +03:00
AUTOMATIC1111 | 8b181c812f | Merge pull request #12584 from AUTOMATIC1111/full-module-with-bias (Add ex_bias into full module) | 2023-08-15 19:24:15 +03:00
AUTOMATIC1111 | f01682ee01 | store patches for Lora in a specialized module | 2023-08-15 19:23:40 +03:00
Kohaku-Blueleaf | aa57a89a21 | full module with ex_bias | 2023-08-15 23:41:46 +08:00
Kohaku-Blueleaf | f70ded8936 | remove "if bias exist" check | 2023-08-14 13:53:40 +08:00
Kohaku-Blueleaf | e7c03ccdce | Merge branch 'dev' into extra-norm-module | 2023-08-14 13:34:51 +08:00
Kohaku-Blueleaf | d9cc27cb29 | Fix MHA updown err and support ex-bias for no-bias layer | 2023-08-14 13:32:51 +08:00
AUTOMATIC1111 | 1c6ca09992 | Merge pull request #12510 from catboxanon/feat/extnet/hashes (Support search and display of hashes for all extra network items) | 2023-08-13 16:46:32 +03:00
AUTOMATIC1111 | db40d26d08 | linter | 2023-08-13 16:38:10 +03:00
AUTOMATIC1111 | d8419762c1 | Lora: output warnings in UI rather than fail for unfitting loras; switch to logging for error output in console | 2023-08-13 15:07:37 +03:00
catboxanon | 7fa5ee54b1 | Support search and display of hashes for all extra network items | 2023-08-13 02:32:54 -04:00
Kohaku-Blueleaf | 5881dcb887 | remove debug print | 2023-08-13 02:36:02 +08:00
Kohaku-Blueleaf | a2b8305096 | return None if no ex_bias | 2023-08-13 02:35:04 +08:00
Kohaku-Blueleaf | bd4da4474b | Add extra norm module into built-in lora ext (refer to LyCORIS 1.9.0.dev6; add new option and module for training norm layer, which is reported to be good for style) | 2023-08-13 02:27:39 +08:00
catboxanon | 4fafc34e49 | Fix to make the LoRA "old method" setting work | 2023-08-10 23:42:58 -04:00