comfyanonymous
afe732bef9
Hunyuan dit can now accept longer prompts.
7 months ago
comfyanonymous
a5f4292f9f
Basic hunyuan dit implementation. (#4102)
...
* Let tokenizers return weights to be stored in the saved checkpoint.
* Basic hunyuan dit implementation.
* Fix some resolutions not working.
* Support hydit checkpoint save.
* Init with right dtype.
* Switch to optimized attention in pooler.
* Fix black images on hunyuan dit.
7 months ago
comfyanonymous
10b43ceea5
Remove duplicate code.
7 months ago
comfyanonymous
9f291d75b3
AuraFlow model implementation.
8 months ago
comfyanonymous
f8f7568d03
Basic SD3 controlnet implementation.
...
Still missing the node to properly use it.
8 months ago
comfyanonymous
66aaa14001
Controlnet refactor.
8 months ago
comfyanonymous
8ddc151a4c
Squash deprecation warning on new pytorch.
8 months ago
comfyanonymous
bb1969cab7
Initial support for the stable audio open model.
8 months ago
comfyanonymous
1281f933c1
Small optimization.
8 months ago
comfyanonymous
605e64f6d3
Fix lowvram issue.
9 months ago
Dango233
73ce178021
Remove redundancy in mmdit.py (#3685)
9 months ago
comfyanonymous
8c4a9befa7
SD3 Support.
9 months ago
comfyanonymous
0920e0e5fe
Remove some unused imports.
9 months ago
comfyanonymous
8508df2569
Work around black image bug on Mac 14.5 by forcing attention upcasting.
9 months ago
comfyanonymous
83d969e397
Disable xformers when tracing model.
9 months ago
comfyanonymous
1900e5119f
Fix potential issue.
9 months ago
comfyanonymous
0bdc2b15c7
Cleanup.
9 months ago
comfyanonymous
98f828fad9
Remove unnecessary code.
9 months ago
comfyanonymous
46daf0a9a7
Add debug options to force on and off attention upcasting.
9 months ago
comfyanonymous
ec6f16adb6
Fix SAG.
10 months ago
comfyanonymous
bb4940d837
Only enable attention upcasting on models that actually need it.
10 months ago
comfyanonymous
b0ab31d06c
Refactor attention upcasting code part 1.
10 months ago
comfyanonymous
2aed53c4ac
Workaround xformers bug.
10 months ago
comfyanonymous
d7897fff2c
Move cascade scale factor from stage_a to latent_formats.py
12 months ago
comfyanonymous
2a813c3b09
Switch some more prints to logging.
12 months ago
comfyanonymous
5f60ee246e
Support loading the sr cascade controlnet.
12 months ago
comfyanonymous
03e6e81629
Set upscale algorithm to bilinear for stable cascade controlnet.
12 months ago
comfyanonymous
03e83bb5d0
Support stable cascade canny controlnet.
12 months ago
comfyanonymous
cb7c3a2921
Allow image_only_indicator to be None.
1 year ago
comfyanonymous
b3e97fc714
Koala 700M and 1B support.
...
Use the UNET Loader node to load the unet files for these models.
1 year ago
comfyanonymous
e93cdd0ad0
Remove print.
1 year ago
comfyanonymous
a7b5eaa7e3
Forgot to commit this.
1 year ago
comfyanonymous
6bcf57ff10
Fix attention masks properly for multiple batches.
1 year ago
comfyanonymous
11e3221f1f
fp8 weight support for Stable Cascade.
1 year ago
comfyanonymous
f8706546f3
Fix attention mask batch size in some attention functions.
1 year ago
comfyanonymous
3b9969c1c5
Properly fix attention masks in CLIP with batches.
1 year ago
comfyanonymous
805c36ac9c
Make Stable Cascade work on old pytorch 2.0.
1 year ago
comfyanonymous
667c92814e
Stable Cascade Stage B.
1 year ago
comfyanonymous
f83109f09b
Stable Cascade Stage C.
1 year ago
comfyanonymous
5e06baf112
Stable Cascade Stage A.
1 year ago
comfyanonymous
c661a8b118
Don't use numpy for calculating sigmas.
1 year ago
comfyanonymous
89507f8adf
Remove some unused imports.
1 year ago
comfyanonymous
2395ae740a
Make unclip more deterministic.
...
Pass a seed argument; note that this might make old unclip images different.
1 year ago
comfyanonymous
6a7bc35db8
Use basic attention implementation for small inputs on old pytorch.
1 year ago
comfyanonymous
c6951548cf
Update optimized_attention_for_device function for new functions that support masked attention.
1 year ago
comfyanonymous
aaa9017302
Add attention mask support to sub quad attention.
1 year ago
comfyanonymous
0c2c9fbdfa
Support attention mask in split attention.
1 year ago
comfyanonymous
3ad0191bfb
Implement attention mask on xformers.
1 year ago
comfyanonymous
8c6493578b
Implement noise augmentation for SD 4X upscale model.
1 year ago
comfyanonymous
79f73a4b33
Remove useless code.
1 year ago