152 Commits (cd5017c1c9b3a7b0fec892e80290a32616bbff38)

Author SHA1 Message Date
comfyanonymous 33fb282d5c Fix issue. 6 months ago
comfyanonymous 3b71f84b50 ONNX tracing fixes. 7 months ago
comfyanonymous 25853d0be8 Use common function for casting weights to input. 7 months ago
comfyanonymous 79040635da Remove unnecessary code. 7 months ago
comfyanonymous 66d35c07ce Reduce artifacts on hydit, auraflow and SD3 at specific resolutions.
Using circular padding instead of reflection padding breaks seeds for
resolutions that are not a multiple of 16 in pixels, but should lower
the amount of artifacts when doing img2img at those resolutions.
7 months ago
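As a rough illustration of the padding change described above, a minimal PyTorch sketch; the tensor shape and pad width are illustrative, not taken from the model code:

```python
import torch
import torch.nn.functional as F

x = torch.arange(16.0).reshape(1, 1, 4, 4)  # (batch, channels, H, W)

# Reflection padding mirrors the border pixels back into the pad region,
# which can emphasize edge features and contribute to img2img artifacts.
reflected = F.pad(x, (1, 1, 1, 1), mode="reflect")

# Circular padding wraps the opposite edge around instead. The border
# values change, so the model output (and therefore the image a given
# seed produces) changes at resolutions where padding is needed.
circular = F.pad(x, (1, 1, 1, 1), mode="circular")

print(reflected[0, 0])
print(circular[0, 0])
```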
comfyanonymous 10b43ceea5 Remove duplicate code. 7 months ago
comfyanonymous f8f7568d03 Basic SD3 controlnet implementation.
Still missing the node to properly use it.
8 months ago
comfyanonymous bb1969cab7 Initial support for the stable audio open model. 8 months ago
comfyanonymous 1281f933c1 Small optimization. 8 months ago
comfyanonymous 605e64f6d3 Fix lowvram issue. 9 months ago
Dango233 73ce178021 Remove redundancy in mmdit.py (#3685) 9 months ago
comfyanonymous 8c4a9befa7 SD3 Support. 9 months ago
comfyanonymous 0920e0e5fe Remove some unused imports. 9 months ago
comfyanonymous 8508df2569 Work around black image bug on Mac 14.5 by forcing attention upcasting. 9 months ago
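A minimal sketch of what forcing attention upcasting means here, assuming the usual fix of running the softmax in float32; the function name is illustrative, not the actual ComfyUI code path:

```python
import torch

def attention_upcast(q, k, v):
    out_dtype = q.dtype
    # Run the attention math in float32 even for fp16 models: numeric
    # trouble in the fp16 softmax is the suspected cause of black images.
    q, k, v = q.float(), k.float(), v.float()
    scale = q.shape[-1] ** -0.5
    attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)
    return (attn @ v).to(out_dtype)
```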
comfyanonymous 83d969e397 Disable xformers when tracing model. 9 months ago
comfyanonymous 1900e5119f Fix potential issue. 9 months ago
comfyanonymous 0bdc2b15c7 Cleanup. 9 months ago
comfyanonymous 98f828fad9 Remove unnecessary code. 9 months ago
comfyanonymous 46daf0a9a7 Add debug options to force on and off attention upcasting. 9 months ago
comfyanonymous ec6f16adb6 Fix SAG. 10 months ago
comfyanonymous bb4940d837 Only enable attention upcasting on models that actually need it. 10 months ago
comfyanonymous b0ab31d06c Refactor attention upcasting code part 1. 10 months ago
comfyanonymous 2aed53c4ac Workaround xformers bug. 10 months ago
comfyanonymous 2a813c3b09 Switch some more prints to logging. 12 months ago
comfyanonymous cb7c3a2921 Allow image_only_indicator to be None. 1 year ago
comfyanonymous b3e97fc714 Koala 700M and 1B support.
Use the UNET Loader node to load the unet file to use them.
1 year ago
comfyanonymous 6bcf57ff10 Fix attention masks properly for multiple batches. 1 year ago
comfyanonymous f8706546f3 Fix attention mask batch size in some attention functions. 1 year ago
comfyanonymous 3b9969c1c5 Properly fix attention masks in CLIP with batches. 1 year ago
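A hedged sketch of the kind of batch fix these mask commits describe: broadcasting a per-sequence mask across the flattened batch (and heads) dimension. The names and the divisibility assumption are illustrative:

```python
import torch

def expand_mask_for_batch(mask, batch_heads):
    # A 2D (seq_q, seq_k) mask applies to every batch element; give it
    # a leading dim so it can be repeated.
    if mask.ndim == 2:
        mask = mask.unsqueeze(0)
    # Repeat the mask when the attention inputs were flattened to
    # (batch * heads, seq, dim); assumes batch_heads is a multiple of
    # the mask's leading dim.
    if mask.shape[0] != batch_heads:
        mask = mask.repeat(batch_heads // mask.shape[0], 1, 1)
    return mask
```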
comfyanonymous c661a8b118 Don't use numpy for calculating sigmas. 1 year ago
comfyanonymous 89507f8adf Remove some unused imports. 1 year ago
comfyanonymous 2395ae740a Make unclip more deterministic.
Pass a seed argument; note that this might make old unclip images different.
1 year ago
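A sketch of the determinism fix, assuming the seed argument is threaded into a torch.Generator instead of relying on the global RNG; the function name is illustrative:

```python
import torch

def unclip_noise(shape, seed=None):
    # With a fixed seed the same noise is produced on every run, which
    # makes the result reproducible -- and is also why images generated
    # before this change can no longer be reproduced exactly.
    generator = None if seed is None else torch.Generator().manual_seed(seed)
    return torch.randn(shape, generator=generator)
```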
comfyanonymous 6a7bc35db8 Use basic attention implementation for small inputs on old pytorch. 1 year ago
comfyanonymous c6951548cf Update optimized_attention_for_device function for new functions that
support masked attention.
1 year ago
comfyanonymous aaa9017302 Add attention mask support to sub quad attention. 1 year ago
comfyanonymous 0c2c9fbdfa Support attention mask in split attention. 1 year ago
comfyanonymous 3ad0191bfb Implement attention mask on xformers. 1 year ago
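All three mask commits follow the same basic pattern: apply an additive mask to the raw attention scores before the softmax. A minimal sketch, with shapes and names illustrative:

```python
import torch

def masked_attention(q, k, v, mask=None):
    scores = q @ k.transpose(-2, -1) * (q.shape[-1] ** -0.5)
    if mask is not None:
        # Additive mask: 0 where attention is allowed, a large negative
        # value (or -inf) where it is blocked.
        scores = scores + mask
    return torch.softmax(scores, dim=-1) @ v
```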
comfyanonymous 8c6493578b Implement noise augmentation for SD 4X upscale model. 1 year ago
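A hedged sketch of noise augmentation for an upscale model: corrupt the low-resolution conditioning image by a controllable amount and feed that amount to the model as extra conditioning. The linear mix below is a stand-in for the real diffusion-schedule noising:

```python
import torch

def noise_augment(cond_image, noise_level):
    # noise_level in [0, 1]; 0 leaves the conditioning image clean.
    noise = torch.randn_like(cond_image)
    noised = cond_image * (1.0 - noise_level) + noise * noise_level
    # The model also receives the level so it knows how degraded the
    # conditioning image is.
    return noised, noise_level
```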
comfyanonymous 79f73a4b33 Remove useless code. 1 year ago
comfyanonymous 61b3f15f8f Fix lowvram mode not working with unCLIP and Revision code. 1 year ago
comfyanonymous d0165d819a Fix SVD lowvram mode. 1 year ago
comfyanonymous 261bcbb0d9 A few missing comfy ops in the VAE. 1 year ago
comfyanonymous a5056cfb1f Remove useless code. 1 year ago
comfyanonymous 77755ab8db Refactor comfy.ops
comfy.ops -> comfy.ops.disable_weight_init

This should make it more clear what they actually do.

Some unused code has also been removed.
1 year ago
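A minimal sketch of the disable_weight_init idea, assuming the goal is to skip PyTorch's default parameter initialization for layers whose weights are immediately overwritten by a checkpoint load; the class layout is illustrative, not the exact comfy.ops contents:

```python
import torch.nn as nn

class disable_weight_init:
    class Linear(nn.Linear):
        def reset_parameters(self):
            # nn.Linear.__init__ calls reset_parameters(); a no-op here
            # skips the (wasted) random init before the checkpoint load.
            return None

    class Conv2d(nn.Conv2d):
        def reset_parameters(self):
            return None

# Usage: layer = disable_weight_init.Linear(320, 320)
```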
comfyanonymous fbdb14d4c4 Cleaner CLIP text encoder implementation.
Use a simple CLIP model implementation instead of the one from
transformers.

This will allow some interesting things that would be too hackish to implement
using the transformers implementation.
1 year ago
comfyanonymous 1bbd65ab30 Missed this one. 1 year ago
comfyanonymous 31b0f6f3d8 UNET weights can now be stored in fp8.
The --fp8_e4m3fn-unet and --fp8_e5m2-unet flags select between the two
fp8 formats supported by pytorch.
1 year ago
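A hedged sketch of what fp8 weight storage implies, assuming weights are held in an fp8 dtype and upcast to the compute dtype at use; requires a PyTorch build with float8 support:

```python
import torch

# Store the parameter in fp8 to roughly halve memory vs fp16.
weight = torch.randn(320, 320, dtype=torch.float16)
stored = weight.to(torch.float8_e4m3fn)  # or torch.float8_e5m2

def weight_for_compute(stored, compute_dtype=torch.float16):
    # Most ops cannot consume fp8 tensors directly, so cast back to the
    # compute dtype right before the matmul.
    return stored.to(compute_dtype)
```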
comfyanonymous af365e4dd1 All the unet ops with weights are now handled by comfy.ops 1 year ago
comfyanonymous 39e75862b2 Fix regression from last commit. 1 year ago
comfyanonymous 50dc39d6ec Clean up the extra_options dict for the transformer patches.
Now everything in transformer_options gets put in extra_options.
1 year ago