Mirror of https://github.com/comfyanonymous/ComfyUI.git, synced 2026-01-30 04:59:51 +00:00
* flux: math: Use addcmul_ to avoid an expensive VRAM intermediate

  The RoPE step can be the VRAM peak, and allocating a separate tensor for the addition result before the original is released can OOM. Use in-place addcmul_ instead.

* wan: Delete the self-attention output before cross attention

  This saves VRAM when the cross attention and FFN are the VRAM peak.