Commit Graph

15 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Jaret Burkett | 88b3fbae37 | Various experiments and minor bug fixes for edge cases | 2025-04-25 13:44:38 -06:00 |
| Jaret Burkett | 12e3095d8a | Fixed issue with saving base model version | 2025-04-19 14:34:01 -06:00 |
| Jaret Burkett | 77001ee77f | Upodate model tag on loras | 2025-04-19 10:41:27 -06:00 |
| Jaret Burkett | f80cf99f40 | Hidream is training, but has a memory leak | 2025-04-13 23:28:18 +00:00 |
| Jaret Burkett | ca3ce0f34c | Make it easier to designate lora blocks for new models. Improve i2v adapter speed. Fix issue with i2v adapter where cached torch tensor was wrong range. | 2025-04-13 13:49:13 -06:00 |
| Jaret Burkett | a8680c75eb | Added initial support for finetuning wan i2v WIP | 2025-04-07 20:34:38 -06:00 |
| Jaret Burkett | 5ea19b6292 | small bug fixes | 2025-03-30 20:09:40 -06:00 |
| Jaret Burkett | 860d892214 | Pixel shuffle adapter. Some bug fixes thrown in | 2025-03-29 21:15:01 -06:00 |
| Jaret Burkett | 5365200da1 | Added ability to add models to finetune as plugins. Also added flux2 new arch via that method. | 2025-03-27 16:07:00 -06:00 |
| Jaret Burkett | f5aa4232fa | Added ability to quantize with torchao | 2025-03-20 16:28:54 -06:00 |
| Jaret Burkett | 604e76d34d | Fix issue with full finetuning wan | 2025-03-17 09:17:40 -06:00 |
| Jaret Burkett | e6739f7eb2 | Convert wan lora weights on save to be something comfy can handle | 2025-03-08 12:55:11 -07:00 |
| Jaret Burkett | 391cf80fea | Added training for Wan2.1. Not finalized, wait. | 2025-03-07 13:53:44 -07:00 |
| Jaret Burkett | 6f6fb90812 | Added cogview4. Loss still needs work. | 2025-03-04 18:43:52 -07:00 |
| Jaret Burkett | acc79956aa | WIP create new class to add new models more easily | 2025-03-01 13:49:02 -07:00 |