70 Commits (935ae153e154813ace36db4c4656a5e96f403eba)

Author SHA1 Message Date
comfyanonymous 2ca8f6e23d Make the stochastic fp8 rounding reproducible. 6 months ago
comfyanonymous c6812947e9 Fix potential memory leak. 6 months ago
comfyanonymous 7df42b9a23 Fix dora. 6 months ago
comfyanonymous c26ca27207 Move calculate function to comfy.lora 6 months ago
comfyanonymous 7c6bb84016 Code cleanups. 6 months ago
comfyanonymous 4d341b78e8 Bug fixes. 6 months ago
comfyanonymous 6138f92084 Use better dtype for the lowvram lora system. 6 months ago
comfyanonymous be0726c1ed Remove duplication. 6 months ago
comfyanonymous 20ace7c853 Code cleanup. 6 months ago
comfyanonymous bb222ceddb Fix loras having a weak effect when applied on fp8. 6 months ago
comfyanonymous cd5017c1c9 Allow the calculate_weight function to use a different dtype. 6 months ago
comfyanonymous 83f343146a Fix potential lowvram issue. 6 months ago
comfyanonymous 74e124f4d7 Fix some issues with TE being in lowvram mode. 6 months ago
comfyanonymous 5d43e75e5b Fix some issues with the model sometimes not getting patched. 6 months ago
comfyanonymous 517f4a94e4 Fix some lora loading slowdowns. 6 months ago
comfyanonymous 52a471c5c7 Change name of log. 7 months ago
comfyanonymous 1de69fe4d5 Fix some issues with inference slowing down. 7 months ago
comfyanonymous 6678d5cf65 Fix regression. 7 months ago
comfyanonymous a3cc326748 Better fix for lowvram issue. 7 months ago
comfyanonymous 5acdadc9f3 Fix issue with some lowvram weights. 7 months ago
comfyanonymous 1e11d2d1f5 Better prints. 7 months ago
comfyanonymous 08f92d55e9 Partial model shift support. 7 months ago
comfyanonymous cb7c4b4be3 Workaround for lora OOM on lowvram mode. 7 months ago
comfyanonymous b334605a66 Fix OOMs happening in some cases.
A cloned model patcher sometimes reported a model was loaded on a device when it wasn't.
7 months ago
Extraltodeus f1a01c2c7e Add sampler_pre_cfg_function (#3979)
* Update samplers.py

* Update model_patcher.py
8 months ago
comfyanonymous 73ca780019 Add SamplerEulerCFG++ node.
This node should match the DDIM implementation of CFG++ when "regular" is selected.

"alternative" is a slightly different take on CFG++
8 months ago
comfyanonymous 028a583bef Fix issue with full diffusers SD3 loras. 8 months ago
comfyanonymous 0e06b370db Print key names for easier debugging. 8 months ago
comfyanonymous 37a08a41b3 Support setting weight offsets in weight patcher. 8 months ago
comfyanonymous 809cc85a8e Remove useless code. 9 months ago
comfyanonymous ffc4b7c30e Fix DORA strength.
This is a different version of #3298 with more correct behavior.
9 months ago
comfyanonymous efa5a711b2 Reduce memory usage when applying DORA: #3557 9 months ago
Chenlei Hu 7718ada4ed Add type annotation UnetWrapperFunction (#3531)
* Add type annotation UnetWrapperFunction

* nit

* Add types.py
9 months ago
comfyanonymous 2d41642716 Fix lowvram dora issue. 9 months ago
comfyanonymous fa6dd7e5bb Fix lowvram issue with saving checkpoints.
The previous fix didn't cover the case where the model was loaded in lowvram mode right before.
10 months ago
comfyanonymous e1489ad257 Fix issue with lowvram mode breaking model saving. 10 months ago
comfyanonymous 719fb2c81d Add basic PAG node. 10 months ago
kk-89 38ed2da2dd Fix typo in lowvram patcher (#3209) 11 months ago
comfyanonymous 41ed7e85ea Fix object_patches_backup not being the same object across clones. 11 months ago
comfyanonymous 1f4fc9ea0c Fix issue with get_model_object on patched model. 11 months ago
comfyanonymous 1a0486bb96 Fix model needing to be loaded on GPU to generate the sigmas. 11 months ago
comfyanonymous ae77590b4e dora_scale support for lora file. 11 months ago
comfyanonymous c18a203a8a Don't unload model weights for non weight patches. 11 months ago
comfyanonymous db8b59ecff Lower memory usage for loras in lowvram mode at the cost of perf. 12 months ago
comfyanonymous 65397ce601 Replace prints with logging and add --verbose argument. 12 months ago
comfyanonymous 12c1080ebc Simplify differential diffusion code. 12 months ago
comfyanonymous 1abf8374ec utils.set_attr can now be used to set any attribute.
The old set_attr has been renamed to set_attr_param.
12 months ago
comfyanonymous 9f71e4b62d Let model patches patch sub objects. 12 months ago
comfyanonymous ef4f6037cb Fix model patches not working in custom sampling scheduler nodes. 1 year ago
comfyanonymous 12e822c6c8 Use function to calculate model size in model patcher. 1 year ago