Commit Graph

162 Commits

Author SHA1 Message Date
AUTOMATIC1111
cb84a304f0 Merge pull request #4273 from Omegastick/ordered_hypernetworks
Sort hypernetworks list
2022-11-05 16:16:18 +03:00
Muhammad Rizqi Nur
bb832d7725 Simplify grad clip 2022-11-05 11:48:38 +07:00
Isaac Poulton
08feb4c364 Sort straight out of the glob 2022-11-04 20:53:11 +07:00
Muhammad Rizqi Nur
3277f90e93 Merge branch 'master' into gradient-clipping 2022-11-04 18:47:28 +07:00
Isaac Poulton
fd62727893 Sort hypernetworks 2022-11-04 18:34:35 +07:00
Fampai
39541d7725 Fixes race condition in training when VAE is unloaded
set_current_image can attempt to use the VAE when it is unloaded to
the CPU while training
2022-11-04 04:50:22 -04:00
aria1th
1ca0bcd3a7 only save if option is enabled 2022-11-04 16:09:19 +09:00
aria1th
f5d394214d split before declaring file name 2022-11-04 16:04:03 +09:00
aria1th
283249d239 apply 2022-11-04 15:57:17 +09:00
AngelBottomless
179702adc4 Merge branch 'AUTOMATIC1111:master' into force-push-patch-13 2022-11-04 15:51:09 +09:00
AngelBottomless
0d07cbfa15 I blame code autocomplete 2022-11-04 15:50:54 +09:00
aria1th
0abb39f461 resolve conflict - first revert 2022-11-04 15:47:19 +09:00
AUTOMATIC1111
4918eb6ce4 Merge branch 'master' into hn-activation 2022-11-04 09:02:15 +03:00
aria1th
1764ac3c8b use hash to check valid optim 2022-11-03 14:49:26 +09:00
aria1th
0b143c1163 Separate .optim file from model 2022-11-03 14:30:53 +09:00
Muhammad Rizqi Nur
d5ea878b2a Fix merge conflicts 2022-10-31 13:54:40 +07:00
Muhammad Rizqi Nur
4123be632a Fix merge conflicts 2022-10-31 13:53:22 +07:00
Muhammad Rizqi Nur
cd4d59c0de Merge master 2022-10-30 18:57:51 +07:00
aria1th
9d96d7d0a0 resolve conflicts 2022-10-30 20:40:59 +09:00
AngelBottomless
20194fd975 We have duplicate linear now 2022-10-30 20:40:59 +09:00
AUTOMATIC1111
17a2076f72 Merge pull request #3928 from R-N/validate-before-load
Optimize training a little
2022-10-30 09:51:36 +03:00
Muhammad Rizqi Nur
3d58510f21 Fix dataset still being loaded even when training will be skipped 2022-10-30 00:54:59 +07:00
Muhammad Rizqi Nur
a07f054c86 Add missing info on hypernetwork/embedding model log
Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513

Also group the saving into one
2022-10-30 00:49:29 +07:00
Muhammad Rizqi Nur
ab05a74ead Revert "Add cleanup after training"
This reverts commit 3ce2bfdf95.
2022-10-30 00:32:02 +07:00
Muhammad Rizqi Nur
3ce2bfdf95 Add cleanup after training 2022-10-29 19:43:21 +07:00
Muhammad Rizqi Nur
ab27c111d0 Add input validations before loading dataset for training 2022-10-29 18:09:17 +07:00
Muhammad Rizqi Nur
05e2e40537 Merge branch 'master' into gradient-clipping 2022-10-29 15:04:21 +07:00
timntorres
e98f72be33 Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info 2022-10-29 00:31:23 -07:00
AUTOMATIC1111
810e6a407d Merge pull request #3858 from R-N/log-csv
Fix log off by 1 #3847
2022-10-29 07:55:20 +03:00
AUTOMATIC1111
d3b4b9d7ec Merge pull request #3717 from benkyoujouzu/master
Add missing support for linear activation in hypernetwork
2022-10-29 07:30:14 +03:00
AngelBottomless
f361e804eb Re enable linear 2022-10-29 08:36:50 +09:00
Muhammad Rizqi Nur
9ceef81f77 Fix log off by 1 2022-10-28 20:48:08 +07:00
Muhammad Rizqi Nur
16451ca573 Learning rate sched syntax support for grad clipping 2022-10-28 17:16:23 +07:00
timntorres
db5a354c48 Always ignore "None.pt" in the hypernet directory. 2022-10-28 01:41:57 -07:00
benkyoujouzu
b2a8b263b2 Add missing support for linear activation in hypernetwork 2022-10-28 12:54:59 +08:00
Muhammad Rizqi Nur
2a25729623 Gradient clipping in train tab 2022-10-28 09:44:56 +07:00
AngelBottomless
462e6ba667 Disable unavailable or duplicate options 2022-10-27 15:40:24 +09:00
AngelBottomless
029d7c7543 Revert unresolved changes in Bias initialization
it should be zeros_ or parameterized in future properly.
2022-10-27 14:44:53 +09:00
guaneec
cc56df996e Fix dropout logic 2022-10-27 14:38:21 +09:00
AngelBottomless
85fcccc105 Squashed commit of fixing dropout silently
fix dropouts for future hypernetworks

add kwargs for Hypernetwork class

hypernet UI for gradio input

add recommended options

remove as options

revert adding options in ui
2022-10-27 14:38:21 +09:00
guaneec
b6a8bb123b Fix merge 2022-10-26 15:15:19 +08:00
timntorres
a524d137d0 patch bug (SeverianVoid's comment on 5245c7a) 2022-10-26 10:12:46 +03:00
guaneec
91bb35b1e6 Merge fix 2022-10-26 15:00:03 +08:00
guaneec
649d79a8ec Merge branch 'master' into hn-activation 2022-10-26 14:58:04 +08:00
guaneec
877d94f97c Back compatibility 2022-10-26 14:50:58 +08:00
AngelBottomless
7207e3bf49 remove duplicate keys and lowercase 2022-10-26 09:17:01 +03:00
AngelBottomless
de096d0ce7 Weight initialization and More activation func
add weight init

add weight init option in create_hypernetwork

fstringify hypernet info

save weight initialization info for further debugging

fill bias with zero for He/Xavier

initialize LayerNorm with Normal

fix loading weight_init
2022-10-26 09:17:01 +03:00
guaneec
c702d4d0df Fix off-by-one 2022-10-26 13:43:04 +08:00
guaneec
2f4c91894d Remove activation from final layer of HNs 2022-10-26 12:10:30 +08:00
Melan
18f86e41f6 Removed two unused imports 2022-10-24 17:21:18 +02:00
AngelBottomless
e9a410b535 check length for variance 2022-10-24 09:07:39 +03:00
AngelBottomless
0d2e1dac40 convert deque -> list
I don't feel this being efficient
2022-10-24 09:07:39 +03:00
AngelBottomless
348f89c8d4 statistics for pbar 2022-10-24 09:07:39 +03:00
AngelBottomless
40b56c9289 cleanup some code 2022-10-24 09:07:39 +03:00
AngelBottomless
b297cc3324 Hypernetworks - fix KeyError in statistics caching
Statistics logging has changed to {filename : list[losses]}, so it has to use loss_info[key].pop()
2022-10-24 09:07:39 +03:00
DepFA
1fbfc052eb Update hypernetwork.py 2022-10-23 08:34:33 +03:00
AngelBottomless
48dbf99e84 Allow tracking real-time loss
Someone had 6000 images in their dataset, and it was shown as 0, which was confusing.
This will allow tracking real time dataset-average loss for registered objects.
2022-10-22 22:24:19 +03:00
AngelBottomless
24694e5983 Update hypernetwork.py 2022-10-22 20:25:32 +03:00
discus0434
6a4fa73a38 small fix 2022-10-22 13:44:39 +00:00
discus0434
97749b7c7d Merge branch 'AUTOMATIC1111:master' into master 2022-10-22 22:00:59 +09:00
discus0434
7912acef72 small fix 2022-10-22 13:00:44 +00:00
discus0434
fccba4729d add an option to avoid dying relu 2022-10-22 12:02:41 +00:00
AUTOMATIC
7fd90128eb added a guard for hypernet training that will stop early if weights are getting no gradients 2022-10-22 14:48:43 +03:00
discus0434
dcb45dfecf Merge branch 'master' of upstream 2022-10-22 11:14:46 +00:00
discus0434
0e8ca8e7af add dropout 2022-10-22 11:07:00 +00:00
timntorres
272fa527bb Remove unused variable. 2022-10-21 16:52:24 +03:00
timntorres
19818f023c Match hypernet name with filename in all cases. 2022-10-21 16:52:24 +03:00
timntorres
51e3dc9cca Sanitize hypernet name input. 2022-10-21 16:52:24 +03:00
AUTOMATIC
03a1e288c4 turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets 2022-10-21 10:13:24 +03:00
AUTOMATIC1111
0c5522ea21 Merge branch 'master' into training-help-text 2022-10-21 09:57:55 +03:00
timntorres
4ff274e1e3 Revise comments. 2022-10-21 09:55:00 +03:00
timntorres
5245c7a493 Issue #2921-Give PNG info to Hypernet previews. 2022-10-21 09:55:00 +03:00
AUTOMATIC
c23f666dba a more strict check for activation type and a more reasonable check for type of layer in hypernets 2022-10-21 09:47:43 +03:00
Melan
7543cf5e3b Fixed some typos in the code 2022-10-20 22:43:08 +02:00
Melan
8f59129847 Some changes to the tensorboard code and hypernetwork support 2022-10-20 22:37:16 +02:00
aria1th
f89829ec3a Revert "fix bugs and optimizations"
This reverts commit 108be15500.
2022-10-21 01:37:11 +09:00
AngelBottomless
108be15500 fix bugs and optimizations 2022-10-21 01:00:41 +09:00
AngelBottomless
a71e021236 only linear 2022-10-20 23:48:52 +09:00
AngelBottomless
d8acd34f66 generalized some functions and option for ignoring first layer 2022-10-20 23:43:03 +09:00
discus0434
6b38c2c19c Merge branch 'AUTOMATIC1111:master' into master 2022-10-20 18:51:12 +09:00
AUTOMATIC
930b4c64f7 allow float sizes for hypernet's layer_structure 2022-10-20 08:18:02 +03:00
discus0434
6f98e89486 update 2022-10-20 00:10:45 +00:00
DepFA
166be3919b allow overwrite old hn 2022-10-20 00:09:40 +01:00
DepFA
d6ea584137 change html output 2022-10-20 00:07:57 +01:00
discus0434
2ce52d32e4 fix for #3086 failing to load any previous hypernet 2022-10-19 16:31:12 +00:00
AUTOMATIC
c6e9fed500 fix for #3086 failing to load any previous hypernet 2022-10-19 19:21:16 +03:00
discus0434
3770b8d2fa enable to write layer structure of hn himself 2022-10-19 15:28:42 +00:00
discus0434
42fbda83bb layer options moves into create hnet ui 2022-10-19 14:30:33 +00:00
discus0434
7f8670c4ef Merge branch 'master' into master 2022-10-19 15:18:45 +09:00
Silent
da72becb13 Use training width/height when training hypernetworks. 2022-10-19 09:13:28 +03:00
discus0434
e40ba281f1 update 2022-10-19 01:03:58 +09:00
discus0434
a5611ea502 update 2022-10-19 01:00:01 +09:00
discus0434
6021f7a75f add options to custom hypernetwork layer structure 2022-10-19 00:51:36 +09:00
AngelBottomless
703e6d9e4e check NaN for hypernetwork tuning 2022-10-15 17:15:26 +03:00
AUTOMATIC
c7a86f7fe9 add option to use batch size for training 2022-10-15 09:24:59 +03:00
AUTOMATIC
03d62538ae remove duplicate code for log loss, add step, make it read from options rather than gradio input 2022-10-14 22:43:55 +03:00
AUTOMATIC
326fe7d44b Merge remote-tracking branch 'Melanpan/master' 2022-10-14 22:14:50 +03:00
AUTOMATIC
c344ba3b32 add option to read generation params for learning previews from txt2img 2022-10-14 20:31:49 +03:00
AUTOMATIC
354ef0da3b add hypernetwork multipliers 2022-10-13 20:12:37 +03:00
Melan
8636b50aea Add learn_rate to csv and removed a left-over debug statement 2022-10-13 12:37:58 +02:00