ComfyUI/comfy
rattus 123a7874a9 ops: Fix vanilla-fp8 loaded lora quality (#12390)
This was missing the stochastic rounding required for fp8 downcast
to be consistent with model_patcher.patch_weight_to_device.

Missed in testing as I spent too much time with quantized tensors
and overlooked the simpler ones.
2026-02-10 13:38:28 -05:00
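For reference, a minimal sketch of what stochastic rounding on an fp8 downcast looks like, assuming PyTorch >= 2.1 with float8 dtypes. The function name and the handling of denormals/special values are illustrative, not ComfyUI's actual implementation in `comfy`.

```python
import torch

def stochastic_round_to_fp8(x: torch.Tensor,
                            dtype: torch.dtype = torch.float8_e4m3fn) -> torch.Tensor:
    """Downcast to fp8 so that the quantized weight equals the fp32 weight
    in expectation, instead of always rounding to nearest (which accumulates
    bias when a LoRA delta is merged into an already-quantized base weight)."""
    x = x.float()
    # Spacing (ULP) between adjacent fp8 values around |x|: e4m3 has 3
    # mantissa bits, e5m2 has 2. frexp returns a mantissa in [0.5, 1),
    # hence the extra -1 on the exponent.
    mantissa_bits = 3 if dtype == torch.float8_e4m3fn else 2
    _, exponent = torch.frexp(x)
    ulp = torch.ldexp(torch.ones_like(x), exponent - 1 - mantissa_bits)
    # Add uniform noise in [0, 1) before flooring: the value rounds up with
    # probability equal to its fractional distance past the lower neighbor.
    noise = torch.rand(x.shape, device=x.device, dtype=x.dtype)
    rounded = torch.floor(x / ulp + noise) * ulp
    # Clamp to the format's finite range; the final cast is then exact.
    limit = torch.finfo(dtype).max
    return rounded.clamp(-limit, limit).to(dtype)
```

In the fix described above, this style of rounding would presumably be applied where a LoRA-patched weight is downcast back to the model's fp8 storage dtype, matching what model_patcher.patch_weight_to_device already does.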