Commit Graph

1149 Commits

Author SHA1 Message Date
Milly
0609ce06c0 Removed duplicate definition of model_path 2022-10-09 12:46:07 +03:00
Brendan Byrd
a65a45272e Don't change the seed initially if "Keep -1 for seeds" is checked
Fixes #1049
2022-10-09 12:43:56 +03:00
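
A rough, hypothetical sketch of the behaviour this change describes (the option and variable names below are assumptions, not the actual UI code): a seed of -1 is left untouched when the option is enabled, so a fresh random seed is only drawn at generation time.

```python
# Hypothetical illustration only; not the actual webui implementation.
def resolve_initial_seed(entered_seed: int, keep_minus_one: bool) -> int:
    """Return the seed to keep in the UI before generation starts."""
    if keep_minus_one and entered_seed == -1:
        # Leave -1 in place; a random seed is drawn later, at generation time.
        return -1
    return entered_seed
```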
Jesse Williams
d74c38108f Confirm that options are valid before starting
When using the 'Sampler' or 'Checkpoint' options, if one of the entered
names has a typo, an error will only be thrown once the `draw_xy_grid`
loop reaches that name. This can waste a lot of time for large grids
with a typo near the end of a list, since the script needs to start over
and re-generate any earlier images to finish making the grid.

Also fixes a typo in a variable name in `draw_xy_grid`.
2022-10-09 12:39:18 +03:00
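
A minimal sketch of the up-front validation this commit describes; the helper names and lookups below are illustrative assumptions rather than the script's real API. The point is to fail fast on a typo instead of discovering it after most of the grid has already been generated.

```python
# Illustrative only: validate every axis value before generating any images.
def validate_axis_values(values, known_names, axis_label):
    unknown = [v for v in values if v not in known_names]
    if unknown:
        raise ValueError(f"Unknown {axis_label} value(s): {', '.join(unknown)}")

def draw_xy_grid(x_values, y_values, known_samplers, known_checkpoints, generate_cell):
    # Check both axes before the (potentially long) generation loop starts.
    validate_axis_values(x_values, known_samplers, "Sampler")
    validate_axis_values(y_values, known_checkpoints, "Checkpoint")
    return [[generate_cell(x, y) for x in x_values] for y in y_values]
```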
AUTOMATIC
6f6798ddab prevent a possible code execution error (thanks, RyotaK) 2022-10-09 12:33:37 +03:00
AUTOMATIC
0241d811d2 Revert "Fix for Prompts_from_file showing extra textbox."
This reverts commit e2930f9821c197da94e208b5ae73711002844efc.
2022-10-09 12:04:44 +03:00
AUTOMATIC
ab4fe4f44c hide filenames for save button by default 2022-10-09 11:59:41 +03:00
Tony Beeman
cbf6dad02d Handle case where on_show returns the wrong number of arguments 2022-10-09 11:16:38 +03:00
Tony Beeman
86cb16886f Pull Request Code Review Fixes 2022-10-09 11:16:38 +03:00
Tony Beeman
e2930f9821 Fix for Prompts_from_file showing extra textbox. 2022-10-09 11:16:38 +03:00
Nicolas Noullet
1ffeb42d38 Fix typo 2022-10-09 11:10:13 +03:00
frostydad
ef93acdc73 remove line break 2022-10-09 11:09:17 +03:00
frostydad
03e570886f Fix incorrect sampler name in output 2022-10-09 11:09:17 +03:00
Fampai
122d42687b Fix VRAM Issue by only loading in hypernetwork when selected in settings 2022-10-09 11:08:11 +03:00
AUTOMATIC1111
e00b4df7c6 Merge pull request #1752 from Greendayle/dev/deepdanbooru
Added DeepDanbooru interrogator
2022-10-09 10:52:21 +03:00
aoirusann
14192c5b20 Support Download for txt files. 2022-10-09 10:49:11 +03:00
aoirusann
5ab7e88d9b Add Download & Download as zip 2022-10-09 10:49:11 +03:00
AUTOMATIC
4e569fd888 fixed incorrect message about loading config; thanks anon! 2022-10-09 10:31:47 +03:00
AUTOMATIC
c77c89cc83 make main model loading and model merger use the same code 2022-10-09 10:23:31 +03:00
AUTOMATIC
050a6a798c support loading .yaml config with same name as model
support EMA weights in processing (????)
2022-10-08 23:26:48 +03:00
Aidan Holland
432782163a chore: Fix typos 2022-10-08 22:42:30 +03:00
Edouard Leurent
610a7f4e14 Break after finding the local directory of stable diffusion
Otherwise, we may override it with one of the next two paths ('.' or '..') if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../.

Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
2022-10-08 22:35:04 +03:00
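
The pattern being fixed looks roughly like the sketch below (the candidate paths, marker file, and variable names are approximations of the launcher's path discovery, not the exact code): once a candidate matches, the loop must break so that '.' or '..' cannot override the result.

```python
import os

# Approximate sketch of the search-and-break fix; not copied from the repo.
script_path = os.path.dirname(os.path.abspath(__file__))
possible_sd_paths = [
    os.path.join(script_path, "repositories/stable-diffusion"),
    ".",
    "..",
]

sd_path = None
for candidate in possible_sd_paths:
    # A known file inside a stable-diffusion checkout acts as the marker.
    if os.path.exists(os.path.join(candidate, "ldm/models/diffusion/ddpm.py")):
        sd_path = os.path.abspath(candidate)
        break  # stop at the first match; do not let '.' or '..' override it

if sd_path is None:
    raise RuntimeError("Couldn't find a local stable-diffusion directory")
```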
AUTOMATIC
3b2141c5fb add 'Ignore last layers of CLIP model' option as a parameter to the infotext 2022-10-08 22:21:15 +03:00
AUTOMATIC
e6e42f98df make --force-enable-xformers work without needing --xformers 2022-10-08 22:12:23 +03:00
Fampai
1371d7608b Added ability to ignore last n layers in FrozenCLIPEmbedder 2022-10-08 22:10:37 +03:00
DepFA
b458fa48fe Update ui.py 2022-10-08 20:38:35 +03:00
DepFA
15c4278f1a TI preprocess wording
I had to check the code to work out what splitting was 🤷🏿
2022-10-08 20:38:35 +03:00
Greendayle
0ec80f0125 Merge branch 'master' into dev/deepdanbooru 2022-10-08 18:28:22 +02:00
AUTOMATIC
3061cdb7b6 add --force-enable-xformers option and also add messages to console regarding cross attention optimizations 2022-10-08 19:22:15 +03:00
AUTOMATIC
f9c5da1592 add fallback for xformers_attnblock_forward 2022-10-08 19:05:19 +03:00
Greendayle
01f8cb4447 made deepdanbooru optional, added to readme, automatic download of deepbooru model 2022-10-08 18:02:56 +02:00
Artem Zagidulin
a5550f0213 alternate prompt 2022-10-08 18:12:19 +03:00
DepFA
34acad1628 Add GZipMiddleware to root demo 2022-10-08 18:03:16 +03:00
C43H66N12O12S2
cc0258aea7 check for ampere without destroying the optimizations. again. 2022-10-08 17:54:16 +03:00
C43H66N12O12S2
017b6b8744 check for ampere 2022-10-08 17:54:16 +03:00
C43H66N12O12S2
7e639cd498 check for 3.10 2022-10-08 17:54:16 +03:00
Greendayle
5329d0aba0 Merge branch 'master' into dev/deepdanbooru 2022-10-08 16:30:28 +02:00
AUTOMATIC
cfc33f99d4 why did you do this 2022-10-08 17:29:06 +03:00
Greendayle
2e8ba0fa47 fix conflicts 2022-10-08 16:27:48 +02:00
Milly
4f33289d0f Fixed typo 2022-10-08 17:15:30 +03:00
AUTOMATIC
27032c47df restore old opt_split_attention/disable_opt_split_attention logic 2022-10-08 17:10:05 +03:00
AUTOMATIC
dc1117233e simplify xformers options: --xformers to enable and that's it 2022-10-08 17:02:18 +03:00
AUTOMATIC
7ff1170a2e emergency fix for xformers (continue + shared) 2022-10-08 16:33:39 +03:00
AUTOMATIC1111
48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash
xformers attention
2022-10-08 16:29:59 +03:00
C43H66N12O12S2
970de9ee68 Update sd_hijack.py 2022-10-08 16:29:43 +03:00
C43H66N12O12S2
7ffea15078 Update requirements_versions.txt 2022-10-08 16:24:06 +03:00
C43H66N12O12S2
ca5f0f149c Update launch.py 2022-10-08 16:22:38 +03:00
C43H66N12O12S2
69d0053583 update sd_hijack_opt to respect new env variables 2022-10-08 16:21:40 +03:00
C43H66N12O12S2
ddfa9a9786 add xformers_available shared variable 2022-10-08 16:20:41 +03:00
C43H66N12O12S2
26b459a379 default to split attention if cuda is available and xformers is not 2022-10-08 16:20:04 +03:00
C43H66N12O12S2
d0e85873ac check for OS and env variable 2022-10-08 16:13:26 +03:00