AUTOMATIC
3ec7b705c7
suggestions and fixes from the PR
2023-05-10 21:21:32 +03:00
AUTOMATIC
028d3f6425
ruff auto fixes
2023-05-10 11:05:02 +03:00
AUTOMATIC
f741a98bac
imports cleanup for ruff
2023-05-10 08:43:42 +03:00
AUTOMATIC
40ff6db532
extra networks UI
...
rework of hypernets: rather than via settings, hypernets are added directly to prompt as <hypernet:name:weight>
2023-01-21 08:36:07 +03:00
aria1th
a4a5475cfa
Variable dropout rate
...
Implements variable dropout rate from #4549
Fixes hypernetwork multiplier being able to be modified during training; also prevents user error by clamping the multiplier to lower values for training.
Renames function to match the torch.nn.Module convention
Fixes RNG reset issue when generating previews by restoring RNG state
2023-01-10 14:56:57 +09:00
Vladimir Mandic
5f1dfbbc95
implement train api
2022-12-24 18:02:22 -05:00
AngelBottomless
20194fd975
We have duplicate linear now
2022-10-30 20:40:59 +09:00
AngelBottomless
f361e804eb
Re-enable linear
2022-10-29 08:36:50 +09:00
AngelBottomless
462e6ba667
Disable unavailable or duplicate options
2022-10-27 15:40:24 +09:00
AngelBottomless
de096d0ce7
Weight initialization and More activation func
...
add weight init
add weight init option in create_hypernetwork
fstringify hypernet info
save weight initialization info for further debugging
fill bias with zero for He/Xavier
initialize LayerNorm with Normal
fix loading weight_init
2022-10-26 09:17:01 +03:00
discus0434
dcb45dfecf
Merge branch 'master' of upstream
2022-10-22 11:14:46 +00:00
discus0434
0e8ca8e7af
add dropout
2022-10-22 11:07:00 +00:00
timntorres
51e3dc9cca
Sanitize hypernet name input.
2022-10-21 16:52:24 +03:00
AUTOMATIC1111
0c5522ea21
Merge branch 'master' into training-help-text
2022-10-21 09:57:55 +03:00
discus0434
6b38c2c19c
Merge branch 'AUTOMATIC1111:master' into master
2022-10-20 18:51:12 +09:00
AUTOMATIC
930b4c64f7
allow float sizes for hypernet's layer_structure
2022-10-20 08:18:02 +03:00
discus0434
6f98e89486
update
2022-10-20 00:10:45 +00:00
DepFA
166be3919b
allow overwrite old hn
2022-10-20 00:09:40 +01:00
discus0434
3770b8d2fa
allow users to write the layer structure of the hypernetwork themselves
2022-10-19 15:28:42 +00:00
discus0434
42fbda83bb
layer options moved into the create hypernetwork UI
2022-10-19 14:30:33 +00:00
AUTOMATIC
6be32b31d1
report that training with medvram is possible.
2022-10-11 23:07:09 +03:00
AUTOMATIC
d4ea5f4d86
add an option to unload models during hypernetwork training to save VRAM
2022-10-11 19:03:08 +03:00
AUTOMATIC
6d09b8d1df
produce error when training with medvram/lowvram enabled
2022-10-11 18:33:57 +03:00
AUTOMATIC
d682444ecc
add option to select hypernetwork modules when creating
2022-10-11 18:04:47 +03:00
AUTOMATIC
b0583be088
more renames
2022-10-11 15:54:34 +03:00
AUTOMATIC
873efeed49
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have
2022-10-11 15:51:30 +03:00