2526 Commits (ce37c11164ebc452592f3b0e67fb63c8c16374c0)
 

Author SHA1 Message Date
Vladimir Semyonov ce37c11164 add DS_Store to gitignore (#4324) 6 months ago
Alex "mcmonkey" Goodwin b5c3906b38 Automatically link the Comfy CI page on PRs (#4326) 6 months ago
    Also uses use_prior_commit so it doesn't get a janked merge commit instead of the real one.
comfyanonymous 5d43e75e5b Fix some issues with the model sometimes not getting patched. 6 months ago
comfyanonymous 517f4a94e4 Fix some lora loading slowdowns. 6 months ago
comfyanonymous 52a471c5c7 Change name of log. 6 months ago
comfyanonymous ad76574cb8 Fix some potential issues with the previous commits. 6 months ago
comfyanonymous 9acfe4df41 Support loading directly to vram with CLIPLoader node. 6 months ago
comfyanonymous 9829b013ea Fix mistake in last commit. 6 months ago
comfyanonymous 5c69cde037 Load TE model straight to vram if certain conditions are met. 6 months ago
comfyanonymous e9589d6d92 Add a way to set model dtype and ops from load_checkpoint_guess_config. 7 months ago
comfyanonymous 0d82a798a5 Remove the ckpt_path from load_state_dict_guess_config. 7 months ago
ljleb 925fff26fd alternative to `load_checkpoint_guess_config` that accepts a loaded state dict (#4249) 7 months ago
    * make alternative fn
    * add back ckpt path as 2nd argument?
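A note on the `load_checkpoint_guess_config` commits around #4249: the PR adds a variant that takes an already-loaded state dict instead of a file path, and 0d82a798a5 then drops the ckpt_path argument from it again. A minimal sketch of how the two entry points could be used side by side, assuming the new function is exposed as `comfy.sd.load_state_dict_guess_config` and returns the same tuple of loaded components as `load_checkpoint_guess_config` (the signatures here are assumptions inferred from the commit messages, not a verified API reference):

```python
# Sketch only: the exact import paths and signatures are assumptions inferred
# from the commit messages above, not a verified ComfyUI API reference.
import comfy.sd
import comfy.utils

ckpt = "models/checkpoints/some_checkpoint.safetensors"  # hypothetical path

# Classic path: ComfyUI opens the file and guesses the model config itself.
model, clip, vae, clipvision = comfy.sd.load_checkpoint_guess_config(
    ckpt, output_vae=True, output_clip=True
)

# Path added by #4249: load (or assemble) the state dict yourself first, e.g. to
# merge or patch tensors in memory, then hand the dict over for config guessing.
state_dict = comfy.utils.load_torch_file(ckpt)
model, clip, vae, clipvision = comfy.sd.load_state_dict_guess_config(state_dict)
```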
comfyanonymous 75b9b55b22 Fix issues with #4302 and support loading diffusers format flux. 7 months ago
Jaret Burkett 1765f1c60c FLUX: Added full diffusers mapping for FLUX.1 schnell and dev. Adds full LoRA support from diffusers LoRAs. (#4302) 7 months ago
comfyanonymous 1de69fe4d5 Fix some issues with inference slowing down. 7 months ago
comfyanonymous ae197f651b Speed up hunyuan dit inference a bit. 7 months ago
comfyanonymous 1b5b8ca81a Fix regression. 7 months ago
comfyanonymous 6678d5cf65 Fix regression. 7 months ago
TTPlanetPig e172564eea Update controlnet.py to fix the default controlnet weight as constant (#4285) 7 months ago
comfyanonymous a3cc326748 Better fix for lowvram issue. 7 months ago
comfyanonymous 86a97e91fc Fix controlnet regression. 7 months ago
comfyanonymous 5acdadc9f3 Fix issue with some lowvram weights. 7 months ago
comfyanonymous 55ad9d5f8c Fix regression. 7 months ago
comfyanonymous a9f04edc58 Implement text encoder part of HunyuanDiT loras. 7 months ago
comfyanonymous a475ec2300 Cleanup HunyuanDiT controlnets. 7 months ago
    Use the ControlNetApply SD3 and HunyuanDiT node.
来新璐 06eb9fb426 feat: add support for HunYuanDit ControlNet (#4245) 7 months ago
    * add support for HunYuanDit ControlNet
    * fix hunyuandit controlnet
    * fix typo in hunyuandit controlnet
    * fix typo in hunyuandit controlnet
    * fix code format style
    * add control_weight support for HunyuanDit Controlnet
    * use control_weights in HunyuanDit Controlnet
    * fix typo
comfyanonymous 413322645e Raw torch is faster than einops? 7 months ago
comfyanonymous 11200de970 Cleaner code. 7 months ago
comfyanonymous 037c38eb0f Try to improve inference speed on some machines. 7 months ago
comfyanonymous 1e11d2d1f5 Better prints. 7 months ago
Alex "mcmonkey" Goodwin 65ea6be38f PullRequest CI Run: use pull_request_target to allow the CI Dashboard to work (#4277) 7 months ago
    '_target' allows secrets to pass through; we're only using the secret that allows uploading to the dashboard, and we manually vet PRs before running this workflow anyway.
Alex "mcmonkey" Goodwin 5df6f57b5d minor fix on copypasta action name (#4276) 7 months ago
    My bad, sorry.
Alex "mcmonkey" Goodwin 6588bfdef9 add GitHub workflow for CI tests of PRs (#4275) 7 months ago
    When the 'Run-CI-Test' label is added to a PR, it will be tested by the CI, on a small matrix of stable versions.
Alex "mcmonkey" Goodwin 50ed2879ef Add full CI test matrix GitHub Workflow (#4274) 7 months ago
    Automatically runs a matrix of full GPU-enabled tests on all new commits to the ComfyUI master branch.
comfyanonymous 66d4233210 Fix. 7 months ago
comfyanonymous 591010b7ef Support diffusers text attention flux loras. 7 months ago
comfyanonymous 08f92d55e9 Partial model shift support. 7 months ago
comfyanonymous 8115d8cce9 Add Flux fp16 support hack. 7 months ago
comfyanonymous 6969fc9ba4 Make supported_dtypes a priority list. 7 months ago
comfyanonymous cb7c4b4be3 Workaround for lora OOM on lowvram mode. 7 months ago
comfyanonymous 1208863eca Fix "Comfy" lora keys. 7 months ago
    They are in this format now: diffusion_model.full.model.key.name.lora_up.weight
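To illustrate the key naming scheme this commit and 17030fd4c0 below describe, here is a purely hypothetical converter that renames lora up/down matrices into the fixed `diffusion_model.<full model key>.lora_up.weight` / `.lora_down.weight` form; the helper name, input layout, and example layer are invented for the sketch, and only the target key pattern comes from the commit messages:

```python
# Illustrative only: build "Comfy" format lora keys of the shape
#   diffusion_model.<full.model.key.name>.lora_up.weight
#   diffusion_model.<full.model.key.name>.lora_down.weight
# The input layout (a dict of (up, down) pairs keyed by the patched weight's
# name) is a made-up convention for this sketch, not a ComfyUI API.
import torch

def to_comfy_lora_format(pairs: dict) -> dict:
    out = {}
    for model_key, (up, down) in pairs.items():
        # model_key is the full key of the weight being patched, minus the
        # trailing ".weight", e.g. "input_blocks.1.1.proj_in" (hypothetical).
        out[f"diffusion_model.{model_key}.lora_up.weight"] = up
        out[f"diffusion_model.{model_key}.lora_down.weight"] = down
    return out

# Example: a rank-4 patch for one hypothetical linear layer.
pairs = {"input_blocks.1.1.proj_in": (torch.zeros(320, 4), torch.zeros(4, 320))}
comfy_lora_sd = to_comfy_lora_format(pairs)
```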
comfyanonymous e1c528196e Fix bundled embed. 7 months ago
comfyanonymous 17030fd4c0 Support for "Comfy" lora format. 7 months ago
    The keys are just: model.full.model.key.name.lora_up.weight
    It is supported by all models that ComfyUI supports.
    Now people can just convert loras to this format instead of having to ask me to implement them.
comfyanonymous c19dcd362f Controlnet code refactor. 7 months ago
comfyanonymous 1c08bf35b4 Support format for embeddings bundled in loras. 7 months ago
PhilWun 2a02546e20 Add type hints to folder_paths.py (#4191) 7 months ago
    * add type hints to folder_paths.py
    * replace deprecated standard collections type hints
    * fix type error when using Python 3.8
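For context on the "deprecated standard collections type hints" and "Python 3.8" bullets above: PEP 585 deprecates `typing.List`/`typing.Dict` in favor of the builtin `list`/`dict` generics, but subscripting the builtins at runtime only works on Python 3.9+. A common 3.8-compatible pattern (not necessarily exactly what the PR did) is postponed annotation evaluation:

```python
# PEP 563: all annotations become lazily evaluated strings, so builtin generics
# like list[str] can be used in annotations even on Python 3.8.
from __future__ import annotations

import os

def list_files(folder: str) -> list[str]:
    """Toy stand-in for a folder_paths-style helper; not the real function."""
    return sorted(os.listdir(folder))
```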
comfyanonymous b334605a66 Fix OOMs happening in some cases. 7 months ago
    A cloned model patcher sometimes reported a model was loaded on a device when it wasn't.
comfyanonymous de17a9755e Unload all models if there's an OOM error. 7 months ago
comfyanonymous c14ac98fed Unload models and load them back in lowvram mode when there is no free vram. 7 months ago
Robin Huang 2894511893 Clone taesd with depth of 1 to reduce download size. (#4232) 7 months ago
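The shallow-clone commit boils down to passing `--depth 1` to `git clone` so only the latest revision of the taesd repository is fetched. A small illustrative sketch of the idea in Python (the repository URL, destination, and helper are assumptions, not the actual CI code):

```python
# Shallow clone: --depth 1 fetches only the most recent commit, which keeps the
# download small when the history is not needed (the point of the commit above).
import subprocess

def shallow_clone(repo_url: str, dest: str) -> None:
    subprocess.run(["git", "clone", "--depth", "1", repo_url, dest], check=True)

# Illustrative values; the actual repository and destination used may differ.
shallow_clone("https://github.com/madebyollin/taesd", "taesd")
```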