Xiangxi Guo (Ryan)
f0d5d0111f
Avoid torch.compile graph break for older PyTorch versions (#9344)
Turns out torch.compile has some gaps in its support for context manager decorator
syntax. I've sent patches to fix that in PyTorch, but the fix won't
be available to all the folks running older versions of PyTorch, hence
this trivial patch.
2025-08-14 23:41:37 -04:00
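The graph break in question comes from applying a `@contextlib.contextmanager`-decorated function as a decorator, which older `torch.compile` tracers could not handle. A minimal sketch of the workaround pattern (`interference_scope` and `forward` are hypothetical names, not the actual ComfyUI code):

```python
import contextlib

@contextlib.contextmanager
def interference_scope():
    # Hypothetical context manager standing in for the real one.
    yield

# Older torch.compile versions can hit a graph break when a
# @contextmanager-decorated function is used as a decorator:
#
#   @interference_scope()
#   def forward(x): ...
#
# Workaround: use an explicit `with` block inside the function,
# which traces cleanly on older PyTorch versions too.
def forward(x):
    with interference_scope():
        return x * 2
```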
comfyanonymous
ad19a069f6
Make SLG nodes work on Qwen Image model. (#9345)
2025-08-14 23:16:01 -04:00
Dr.Lt.Data
91555acf2c
Merge branch 'master' into dr-support-pip-cm
2025-08-14 12:01:56 +09:00
Jedrzej Kosinski
e4f7ea105f
Added context window support to core sampling code (#9238)
* Added initial support for basic context windows - in progress
* Add prepare_sampling wrapper for context windows to more accurately estimate latent memory requirements; fix merging of wrappers/callbacks dicts in prepare_model_patcher
* Made context windows compatible with different dimensions; works for WAN, but results are bad
* Fix comfy.patcher_extension.merge_nested_dicts calls in prepare_model_patcher in sampler_helpers.py
* Consider adding some callbacks to the context window code to allow extending behavior without rewriting core code
* Made dim slicing cleaner
* Add Wan Context Windows node for testing
* Made context schedule and fuse method functions be stored on the handler instead of needing to be registered in core code to be found
* Moved some code around between node_context_windows.py and context_windows.py
* Change manual context window nodes names/ids
* Added callbacks to IndexListContexHandler
* Adjusted default values for context_length and context_overlap, made schema.inputs definition for WAN Context Windows less annoying
* Make get_resized_cond more robust for various dim sizes
* Fix typo
* Another small fix
2025-08-13 21:33:05 -04:00
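The windowing described above can be sketched as an index scheduler over the temporal dimension; `context_windows` here is a hypothetical stand-in for the handler logic, with parameter names borrowed from the commit (`context_length`, `context_overlap`) but no claim to match the real implementation:

```python
def context_windows(num_frames: int, context_length: int, context_overlap: int):
    """Yield overlapping index windows across the frame dimension.

    Each window is sampled separately; overlapping regions are later
    fused (e.g. by weighted averaging of the denoised latents).
    """
    step = max(context_length - context_overlap, 1)
    start = 0
    while True:
        end = min(start + context_length, num_frames)
        yield list(range(start, end))
        if end >= num_frames:
            break
        start += step
```

For example, 10 frames with `context_length=4` and `context_overlap=2` produce windows starting every 2 frames, each sharing 2 frames with its neighbor.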
Simon Lui
c991a5da65
Fix XPU iGPU regressions (#9322)
* Change the bf16 check, switch non-blocking transfers off by default (with an option to force them on to regain speed on certain classes of iGPUs), and refactor the XPU check.
* Turn non_blocking off by default for xpu.
* Update README.md for Intel GPUs.
2025-08-13 19:13:35 -04:00
comfyanonymous
9df8792d4b
Make last PR not crash comfy on old pytorch. (#9324)
2025-08-13 15:12:41 -04:00
contentis
3da5a07510
SDPA backend priority (#9299)
2025-08-13 14:53:27 -04:00
Dr.Lt.Data
d7777dc83a
Merge branch 'master' into dr-support-pip-cm
2025-08-14 02:36:19 +09:00
comfyanonymous
560d38f34c
Wan2.2 fun control support. (#9292)
2025-08-12 23:26:33 -04:00
Dr.Lt.Data
264116dc4d
Merge branch 'master' into dr-support-pip-cm
2025-08-12 10:13:31 +09:00
PsychoLogicAu
2208aa616d
Support SimpleTuner lycoris lora for Qwen-Image (#9280)
2025-08-11 16:56:16 -04:00
Dr.Lt.Data
37277e4188
Merge branch 'master' into dr-support-pip-cm
2025-08-10 20:57:20 +09:00
comfyanonymous
5828607ccf
Not sure if AMD actually supports fp16 acc but it doesn't crash. (#9258)
2025-08-09 12:49:25 -04:00
Dr.Lt.Data
106510197a
Merge branch 'master' into dr-support-pip-cm
2025-08-08 23:48:53 +09:00
comfyanonymous
735bb4bdb1
Users report gfx1201 is buggy on flux with pytorch attention. (#9244)
2025-08-08 04:21:00 -04:00
Dr.Lt.Data
2fe58571e2
Merge branch 'master' into dr-support-pip-cm
2025-08-07 07:45:14 +09:00
flybirdxx
4c3e57b0ae
Fixed an issue where qwenLora could not be loaded properly. (#9208)
2025-08-06 13:23:11 -04:00
comfyanonymous
d044a24398
Fix default shift and any latent size for qwen image model. (#9186)
2025-08-05 06:12:27 -04:00
Dr.Lt.Data
46209599ff
Merge branch 'master' into dr-support-pip-cm
2025-08-05 12:24:25 +09:00
comfyanonymous
c012400240
Initial support for qwen image model. (#9179)
2025-08-04 22:53:25 -04:00
Dr.Lt.Data
02317a1f71
Merge branch 'master' into dr-support-pip-cm
2025-08-05 06:21:27 +09:00
comfyanonymous
03895dea7c
Fix another issue with the PR. (#9170)
2025-08-04 04:33:04 -04:00
comfyanonymous
84f9759424
Add some warnings and prevent crash when cond devices don't match. (#9169)
2025-08-04 04:20:12 -04:00
comfyanonymous
7991341e89
Various fixes for broken things from earlier PR. (#9168)
2025-08-04 04:02:40 -04:00
comfyanonymous
140ffc7fdc
Fix broken controlnet from last PR. (#9167)
2025-08-04 03:28:12 -04:00
comfyanonymous
182f90b5ec
Lower cond vram use by casting at the same time as device transfer. (#9159)
2025-08-04 03:11:53 -04:00
Dr.Lt.Data
ac7e83448e
Merge branch 'master' into dr-support-pip-cm
2025-08-04 07:25:20 +09:00
comfyanonymous
aebac22193
Cleanup. (#9160)
2025-08-03 07:08:11 -04:00
comfyanonymous
13aaa66ec2
Make sure context is on the right device. (#9154)
2025-08-02 15:09:23 -04:00
comfyanonymous
5f582a9757
Make sure all the conds are on the right device. (#9151)
2025-08-02 15:00:13 -04:00
comfyanonymous
1e638a140b
Tiny wan vae optimizations. (#9136)
2025-08-01 05:25:38 -04:00
Dr.Lt.Data
5582e2a0f3
Merge branch 'master' into dr-support-pip-cm
2025-07-31 12:33:38 +09:00
chaObserv
61b08d4ba6
Replace manual x * sigmoid(x) with torch silu in VAE nonlinearity (#9057)
2025-07-30 19:25:56 -04:00
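For reference, the replaced expression and its fused equivalent compute the same function; this sketch checks the identity in plain Python (in the actual commit, `torch.nn.functional.silu` replaces the manual form as a single fused op rather than two ops plus a temporary):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def silu_manual(x: float) -> float:
    # The VAE nonlinearity as previously written: x * sigmoid(x).
    # torch.nn.functional.silu computes the same function elementwise.
    return x * sigmoid(x)
```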
comfyanonymous
da9dab7edd
Small wan camera memory optimization. (#9111)
2025-07-30 05:55:26 -04:00
Dr.Lt.Data
3c8196a170
Merge branch 'master' into dr-support-pip-cm
2025-07-30 12:14:34 +09:00
comfyanonymous
dca6bdd4fa
Make wan2.2 5B i2v take a lot less memory. (#9102)
2025-07-29 19:44:18 -04:00
Dr.Lt.Data
62c08e4659
Merge branch 'master' into dr-support-pip-cm
2025-07-29 23:44:44 +09:00
comfyanonymous
7d593baf91
Extra reserved vram on large cards on windows. (#9093)
2025-07-29 04:07:45 -04:00
Dr.Lt.Data
ac7bde1d03
Merge branch 'master' into dr-support-pip-cm
2025-07-29 12:13:25 +09:00
comfyanonymous
c60dc4177c
Remove unnecessary clones in the wan2.2 VAE. (#9083)
2025-07-28 14:48:19 -04:00
comfyanonymous
a88788dce6
Wan 2.2 support. (#9080)
2025-07-28 08:00:23 -04:00
Dr.Lt.Data
6909638a42
Merge branch 'master' into dr-support-pip-cm
2025-07-27 15:01:02 +09:00
comfyanonymous
0621d73a9c
Remove useless code. (#9059)
2025-07-26 04:44:19 -04:00
Dr.Lt.Data
d0625d7f7c
Merge branch 'master' into dr-support-pip-cm
2025-07-26 09:35:21 +09:00
comfyanonymous
e6e5d33b35
Remove useless code. (#9041)
This is only needed on PyTorch 2.0 and older.
2025-07-25 04:58:28 -04:00
Dr.Lt.Data
6b19857c93
Merge branch 'master' into dr-support-pip-cm
2025-07-25 12:21:17 +09:00
Eugene Fairley
4293e4da21
Add WAN ATI support (#8874)
* Add WAN ATI support
* Fixes
* Fix length
* Remove extra functions
* Fix
* Fix
* Ruff fix
* Remove torch.no_grad
* Add batch trajectory logic
* Scale inputs before and after motion patch
* Batch image/trajectory
* Ruff fix
* Clean up
2025-07-24 20:59:19 -04:00
comfyanonymous
69cb57b342
Print xpu device name. (#9035)
2025-07-24 15:06:25 -04:00
honglyua
0ccc88b03f
Support Iluvatar CoreX (#8585)
* Support Iluvatar CoreX
Co-authored-by: mingjiang.li <mingjiang.li@iluvatar.com>
2025-07-24 13:57:36 -04:00
Dr.Lt.Data
4e904305ce
Merge branch 'dr-support-pip-cm'
2025-07-24 12:22:50 +09:00