840 Commits (ca9d300a804fd1bfc67b3de9200c2d09b78899d0)

Author SHA1 Message Date
comfyanonymous ca9d300a80 Better estimation for memory usage during audio VAE encoding/decoding. 8 months ago
comfyanonymous 746a0410d4 Fix VAEEncode with taesd3. 8 months ago
comfyanonymous 04e8798c37 Improvements to the TAESD3 implementation. 8 months ago
Dr.Lt.Data df7db0e027 support TAESD3 (#3738) 8 months ago
comfyanonymous bb1969cab7 Initial support for the stable audio open model. 8 months ago
comfyanonymous 1281f933c1 Small optimization. 8 months ago
comfyanonymous f2e844e054 Optimize some unneeded if conditions in the sampling code. 8 months ago
comfyanonymous 0ec513d877 Add a --force-channels-last option to run inference models in channels-last mode. 8 months ago
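The --force-channels-last flag refers to PyTorch's channels-last memory format. A generic illustration of what that format means (plain PyTorch, not ComfyUI code):

```python
import torch

# Channels-last keeps the channel dimension contiguous in memory, which some
# convolution kernels execute faster than the default NCHW layout.
conv = torch.nn.Conv2d(3, 16, 3).to(memory_format=torch.channels_last)
x = torch.randn(1, 3, 64, 64).to(memory_format=torch.channels_last)
with torch.no_grad():
    y = conv(x)
print(y.is_contiguous(memory_format=torch.channels_last))  # True
```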
comfyanonymous 0e06b370db Print key names for easier debugging. 8 months ago
Simon Lui 5eb98f0092 Exempt IPEX from non_blocking previews, fixing segmentation faults. (#3708) 9 months ago
comfyanonymous ac151ac169 Support SD3 diffusers lora. 9 months ago
comfyanonymous 37a08a41b3 Support setting weight offsets in weight patcher. 9 months ago
comfyanonymous 605e64f6d3 Fix lowvram issue. 9 months ago
comfyanonymous 1ddf512fdc Don't auto convert clip and vae weights to fp16 when saving checkpoint. 9 months ago
comfyanonymous 694e0b48e0 SD3 better memory usage estimation. 9 months ago
comfyanonymous 69c8d6d8a6 Single and dual clip loader nodes support SD3. Use CLIPLoader to load only t5xxl, or DualCLIPLoader to load only CLIP-L and CLIP-G, for SD3. 9 months ago
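A minimal sketch of what these loader combinations amount to in code, assuming the comfy.sd.load_clip helper and CLIPType enum are present at this point in the history; the checkpoint file names are placeholders:

```python
from comfy import sd

# CLIPLoader-style: t5xxl only for SD3 (placeholder checkpoint name).
clip_t5 = sd.load_clip(ckpt_paths=["t5xxl_fp16.safetensors"],
                       clip_type=sd.CLIPType.SD3)

# DualCLIPLoader-style: CLIP-L + CLIP-G only for SD3.
clip_lg = sd.load_clip(ckpt_paths=["clip_l.safetensors", "clip_g.safetensors"],
                       clip_type=sd.CLIPType.SD3)

# Encode a prompt with whichever combination was loaded.
tokens = clip_lg.tokenize("a photo of a cat")
cond, pooled = clip_lg.encode_from_tokens(tokens, return_pooled=True)
```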
comfyanonymous 0e49211a11 Load the SD3 T5xxl model in the same dtype stored in the checkpoint. 9 months ago
comfyanonymous 5889b7ca0a Support multiple text encoder configurations on SD3. 9 months ago
comfyanonymous 9424522ead Reuse code. 9 months ago
Dango233 73ce178021 Remove redundancy in mmdit.py (#3685) 9 months ago
comfyanonymous a82fae2375 Fix bug with cosxl edit model. 9 months ago
comfyanonymous 8c4a9befa7 SD3 Support. 9 months ago
comfyanonymous a5e6a632f9 Support sampling non-2D latents. 9 months ago
comfyanonymous 742d5720d1 Support zeroing out text embeddings with the attention mask. 9 months ago
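In effect, token positions whose attention-mask entry is zero get their embeddings zeroed before being used as conditioning. A generic sketch of the idea, not ComfyUI's exact code:

```python
import torch

embeddings = torch.randn(1, 77, 768)   # (batch, tokens, dim)
attention_mask = torch.ones(1, 77)
attention_mask[:, 10:] = 0              # pretend only 10 real tokens

# Zero out the embeddings of padded/masked token positions.
masked = embeddings * attention_mask.unsqueeze(-1)
```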
comfyanonymous 6cd8ffc465 Reshape the empty latent image to the right amount of channels if needed. 9 months ago
comfyanonymous 56333d4850 Use the end token for the text encoder attention mask. 9 months ago
comfyanonymous 104fcea0c8 Add function to get the list of currently loaded models. 9 months ago
comfyanonymous b1fd26fe9e pytorch xpu should be flash or mem efficient attention? 9 months ago
comfyanonymous 809cc85a8e Remove useless code. 9 months ago
comfyanonymous b249862080 Add an annoying print to a function I want to remove. 9 months ago
comfyanonymous bf3e334d46 Disable non_blocking when --deterministic or directml. 9 months ago
JettHu b26da2245f Fix UnetParams annotation typo (#3589) 9 months ago
comfyanonymous 0920e0e5fe Remove some unused imports. 9 months ago
comfyanonymous ffc4b7c30e Fix DORA strength. This is a different version of #3298 with more correct behavior. 9 months ago
comfyanonymous efa5a711b2 Reduce memory usage when applying DORA: #3557 9 months ago
comfyanonymous 6c23854f54 Fix OSX latent2rgb previews. 9 months ago
Chenlei Hu 7718ada4ed Add type annotation UnetWrapperFunction (#3531); introduces types.py. 9 months ago
comfyanonymous 8508df2569 Work around black image bug on Mac 14.5 by forcing attention upcasting. 9 months ago
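Attention upcasting here means doing the attention math in float32 even when the model runs in fp16, which avoids the precision/overflow issues behind the black-image bug. A generic sketch of the technique, not the exact ComfyUI code path:

```python
import torch

def attention_upcast(q, k, v):
    # Run the whole attention computation in float32 for numerical stability,
    # then cast the result back to the input dtype (e.g. fp16).
    dtype = q.dtype
    q, k, v = q.float(), k.float(), v.float()
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    out = torch.softmax(scores, dim=-1) @ v
    return out.to(dtype)

q = torch.randn(1, 8, 77, 64, dtype=torch.float16)
k = torch.randn(1, 8, 77, 64, dtype=torch.float16)
v = torch.randn(1, 8, 77, 64, dtype=torch.float16)
out = attention_upcast(q, k, v)  # fp16 result, fp32 math internally
```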
comfyanonymous 83d969e397 Disable xformers when tracing model. 9 months ago
comfyanonymous 1900e5119f Fix potential issue. 9 months ago
comfyanonymous 09e069ae6c Log the pytorch version. 9 months ago
comfyanonymous 11a2ad5110 Fix controlnet not upcasting on models that have it enabled. 9 months ago
comfyanonymous 0bdc2b15c7 Cleanup. 9 months ago
comfyanonymous 98f828fad9 Remove unnecessary code. 9 months ago
comfyanonymous 19300655dd Don't automatically switch to lowvram mode on GPUs with low memory. 9 months ago
comfyanonymous 46daf0a9a7 Add debug options to force on and off attention upcasting. 9 months ago
comfyanonymous 2d41642716 Fix lowvram dora issue. 10 months ago
comfyanonymous ec6f16adb6 Fix SAG. 10 months ago
comfyanonymous bb4940d837 Only enable attention upcasting on models that actually need it. 10 months ago
comfyanonymous b0ab31d06c Refactor attention upcasting code part 1. 10 months ago