Muhammad Rizqi Nur
d5ea878b2a
Fix merge conflicts
2022-10-31 13:54:40 +07:00
Muhammad Rizqi Nur
4123be632a
Fix merge conflicts
2022-10-31 13:53:22 +07:00
Muhammad Rizqi Nur
cd4d59c0de
Merge master
2022-10-30 18:57:51 +07:00
AUTOMATIC1111
17a2076f72
Merge pull request #3928 from R-N/validate-before-load
...
Optimize training a little
2022-10-30 09:51:36 +03:00
Muhammad Rizqi Nur
3d58510f21
Fix dataset still being loaded even when training will be skipped
2022-10-30 00:54:59 +07:00
Muhammad Rizqi Nur
a07f054c86
Add missing info on hypernetwork/embedding model log
...
Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
Also group the saving into one
2022-10-30 00:49:29 +07:00
Muhammad Rizqi Nur
ab05a74ead
Revert "Add cleanup after training"
...
This reverts commit 3ce2bfdf95.
2022-10-30 00:32:02 +07:00
Muhammad Rizqi Nur
3ce2bfdf95
Add cleanup after training
2022-10-29 19:43:21 +07:00
Muhammad Rizqi Nur
ab27c111d0
Add input validations before loading dataset for training
2022-10-29 18:09:17 +07:00
Muhammad Rizqi Nur
05e2e40537
Merge branch 'master' into gradient-clipping
2022-10-29 15:04:21 +07:00
timntorres
e98f72be33
Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info
2022-10-29 00:31:23 -07:00
AUTOMATIC1111
810e6a407d
Merge pull request #3858 from R-N/log-csv
...
Fix log off by 1 #3847
2022-10-29 07:55:20 +03:00
AUTOMATIC1111
d3b4b9d7ec
Merge pull request #3717 from benkyoujouzu/master
...
Add missing support for linear activation in hypernetwork
2022-10-29 07:30:14 +03:00
AngelBottomless
f361e804eb
Re-enable linear
2022-10-29 08:36:50 +09:00
Muhammad Rizqi Nur
9ceef81f77
Fix log off by 1
2022-10-28 20:48:08 +07:00
Muhammad Rizqi Nur
16451ca573
Learning rate sched syntax support for grad clipping
2022-10-28 17:16:23 +07:00
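The "Learning rate sched syntax support for grad clipping" commit reuses the webui's "value:step, value:step, ..." schedule syntax for the gradient-clip value. A minimal sketch of such a parser (function names and the exact syntax handling are assumptions, not the repo's actual code):

```python
def parse_schedule(text):
    """Parse a webui-style schedule string like "0.1:100, 0.01:200".
    Returns [(value, end_step), ...]; a bare value means "until the end"."""
    pairs = []
    for part in text.split(","):
        if ":" in part:
            value, step = part.split(":")
            pairs.append((float(value), int(step)))
        else:
            pairs.append((float(part), None))  # applies for all remaining steps
    return pairs

def value_at(pairs, step):
    """Return the scheduled value (e.g. a grad-clip limit) for a given step."""
    for value, end in pairs:
        if end is None or step <= end:
            return value
    return pairs[-1][0]  # past the last boundary: keep the final value
```

With this, the clip limit passed to the optimizer step can change over training, e.g. a looser clip early and a tighter one later.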
timntorres
db5a354c48
Always ignore "None.pt" in the hypernet directory.
2022-10-28 01:41:57 -07:00
benkyoujouzu
b2a8b263b2
Add missing support for linear activation in hypernetwork
2022-10-28 12:54:59 +08:00
Muhammad Rizqi Nur
2a25729623
Gradient clipping in train tab
2022-10-28 09:44:56 +07:00
AngelBottomless
462e6ba667
Disable unavailable or duplicate options
2022-10-27 15:40:24 +09:00
timntorres
a524d137d0
patch bug (SeverianVoid's comment on 5245c7a)
2022-10-26 10:12:46 +03:00
AngelBottomless
7207e3bf49
remove duplicate keys and lowercase
2022-10-26 09:17:01 +03:00
AngelBottomless
de096d0ce7
Weight initialization and More activation func
...
add weight init
add weight init option in create_hypernetwork
fstringify hypernet info
save weight initialization info for further debugging
fill bias with zero for He/Xavier
initialize LayerNorm with Normal
fix loading weight_init
2022-10-26 09:17:01 +03:00
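The weight-initialization commit above adds He/Xavier options, fills bias with zero for those, and initializes LayerNorm from a Normal distribution. A minimal pure-Python sketch of what those options compute (names and the default branch are hypothetical; the repo itself uses torch.nn.init):

```python
import math
import random

def init_linear(fan_in, fan_out, weight_init="KaimingNormal"):
    """Hypothetical sketch of the init options the commit describes."""
    if weight_init == "KaimingNormal":       # He init: std = sqrt(2 / fan_in)
        std = math.sqrt(2.0 / fan_in)
    elif weight_init == "XavierNormal":      # Xavier: std = sqrt(2 / (fan_in + fan_out))
        std = math.sqrt(2.0 / (fan_in + fan_out))
    else:                                    # fallback: small normal (assumption)
        std = 0.01
    weights = [[random.gauss(0.0, std) for _ in range(fan_in)]
               for _ in range(fan_out)]
    bias = [0.0] * fan_out                   # "fill bias with zero for He/Xavier"
    return weights, bias
```

Saving which initializer was used alongside the model, as the commit does, makes later debugging of diverging runs much easier.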
AngelBottomless
e9a410b535
check length for variance
2022-10-24 09:07:39 +03:00
AngelBottomless
0d2e1dac40
convert deque -> list
...
I don't think this is efficient
2022-10-24 09:07:39 +03:00
AngelBottomless
348f89c8d4
statistics for pbar
2022-10-24 09:07:39 +03:00
AngelBottomless
40b56c9289
cleanup some code
2022-10-24 09:07:39 +03:00
AngelBottomless
b297cc3324
Hypernetworks - fix KeyError in statistics caching
...
Statistics logging has changed to {filename: list[losses]}, so it has to use loss_info[key].pop()
2022-10-24 09:07:39 +03:00
DepFA
1fbfc052eb
Update hypernetwork.py
2022-10-23 08:34:33 +03:00
AngelBottomless
48dbf99e84
Allow tracking real-time loss
...
Someone had 6000 images in their dataset, and the loss was shown as 0, which was confusing.
This allows tracking the real-time dataset-average loss for registered objects.
2022-10-22 22:24:19 +03:00
AngelBottomless
24694e5983
Update hypernetwork.py
2022-10-22 20:25:32 +03:00
discus0434
6a4fa73a38
small fix
2022-10-22 13:44:39 +00:00
discus0434
97749b7c7d
Merge branch 'AUTOMATIC1111:master' into master
2022-10-22 22:00:59 +09:00
discus0434
7912acef72
small fix
2022-10-22 13:00:44 +00:00
discus0434
fccba4729d
add an option to avoid dying relu
2022-10-22 12:02:41 +00:00
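The "avoid dying relu" option above refers to the failure mode where a ReLU unit's output (and thus its gradient) is zero for all inputs, so it never recovers. A leaky variant is the usual remedy; a one-line sketch (the exact activation the commit adds is not specified here, so treat this as an illustration):

```python
def leaky_relu(x, negative_slope=0.01):
    """Leaky ReLU: keeps a small gradient for negative inputs, so units
    cannot get permanently stuck at zero ("dying ReLU")."""
    return x if x > 0 else negative_slope * x
```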
AUTOMATIC
7fd90128eb
added a guard for hypernet training that will stop early if weights are getting no gradients
2022-10-22 14:48:43 +03:00
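The guard commit above stops hypernet training early when the trained weights receive no gradients (a sign the layers were never attached to the loss graph). A minimal sketch of such a check, using a stand-in class since this is not the repo's actual code:

```python
class Param:
    """Stand-in for a trainable tensor; .grad stays None until a
    backward pass actually deposits a gradient on it."""
    def __init__(self, grad=None):
        self.grad = grad

def assert_weights_training(weights):
    """Guard as described in the commit: if no weight got a gradient,
    abort early instead of silently training nothing."""
    if all(p.grad is None for p in weights):
        raise RuntimeError(
            "no gradients are reaching the hypernetwork weights; "
            "check that its layers are part of the computation graph")
```

In real code the same check would run on module.parameters() after the first backward pass, once, before committing to a long run.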
discus0434
dcb45dfecf
Merge branch 'master' of upstream
2022-10-22 11:14:46 +00:00
discus0434
0e8ca8e7af
add dropout
2022-10-22 11:07:00 +00:00
timntorres
272fa527bb
Remove unused variable.
2022-10-21 16:52:24 +03:00
timntorres
19818f023c
Match hypernet name with filename in all cases.
2022-10-21 16:52:24 +03:00
timntorres
51e3dc9cca
Sanitize hypernet name input.
2022-10-21 16:52:24 +03:00
AUTOMATIC
03a1e288c4
turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets
2022-10-21 10:13:24 +03:00
AUTOMATIC1111
0c5522ea21
Merge branch 'master' into training-help-text
2022-10-21 09:57:55 +03:00
timntorres
4ff274e1e3
Revise comments.
2022-10-21 09:55:00 +03:00
timntorres
5245c7a493
Issue #2921: Give PNG info to Hypernet previews.
2022-10-21 09:55:00 +03:00
AUTOMATIC
c23f666dba
a more strict check for activation type and a more reasonable check for type of layer in hypernets
2022-10-21 09:47:43 +03:00
aria1th
f89829ec3a
Revert "fix bugs and optimizations"
...
This reverts commit 108be15500.
2022-10-21 01:37:11 +09:00
AngelBottomless
108be15500
fix bugs and optimizations
2022-10-21 01:00:41 +09:00
AngelBottomless
a71e021236
only linear
2022-10-20 23:48:52 +09:00
AngelBottomless
d8acd34f66
generalized some functions and option for ignoring first layer
2022-10-20 23:43:03 +09:00