20 Commits (bfb52de866ce659c281a6c243c8485109e4d8b8b)

Author SHA1 Message Date
Alexander Brown ce9ac2fe05 Fix clip_g/clip_l mixup (#4168) 7 months ago
comfyanonymous 5f98de7697 Load flux t5 in fp8 if weights are in fp8. 7 months ago
comfyanonymous 1589b58d3e Basic Flux Schnell and Flux Dev model implementation. 7 months ago
comfyanonymous c24f897352 Fix to get fp8 working on T5 base. 7 months ago
comfyanonymous a5991a7aa6 Fix hunyuan dit text encoder weights always being in fp32. 7 months ago
comfyanonymous 2c038ccef0 Lower CLIP memory usage by a bit. 7 months ago
comfyanonymous b85216a3c0 Lower T5 memory usage by a few hundred MB. 7 months ago
comfyanonymous 82cae45d44 Fix potential issue with non clip text embeddings. 7 months ago
comfyanonymous 4ba7fa0244 Refactor: Move sd2_clip.py to text_encoders folder. 7 months ago
comfyanonymous cf4418b806 Don't treat Bert model like CLIP. 7 months ago
    Bert can accept up to 512 tokens, so any prompt with more than 77 tokens should just be passed to it as is instead of being split up like it is for CLIP.
comfyanonymous a9ac56fc0d Own BertModel implementation that works with lowvram. 7 months ago
comfyanonymous a5f4292f9f Basic hunyuan dit implementation. (#4102) 7 months ago
    * Let tokenizers return weights to be stored in the saved checkpoint.
    * Basic hunyuan dit implementation.
    * Fix some resolutions not working.
    * Support hydit checkpoint save.
    * Init with right dtype.
    * Switch to optimized attention in pooler.
    * Fix black images on hunyuan dit.
comfyanonymous f87810cd3e Let tokenizers return weights to be stored in the saved checkpoint. 7 months ago
comfyanonymous 10c919f4c7 Make it possible to load tokenizer data from checkpoints. 7 months ago
comfyanonymous 0a4c49c57c Support MT5. 7 months ago
comfyanonymous 88ed893034 Allow SPieceTokenizer to load model from a byte string. 7 months ago
comfyanonymous 14764aa2e2 Rename LLAMATokenizer to SPieceTokenizer. 7 months ago
comfyanonymous 1305fb294c Refactor: Move some code to the comfy/text_encoders folder. 7 months ago
comfyanonymous 29c2e26724 Better tokenizing code for AuraFlow. 8 months ago
comfyanonymous 9f291d75b3 AuraFlow model implementation. 8 months ago