146 Commits

Author SHA1 Message Date
Jaret Burkett 1930c3edea Fix naming with wan i2v new keys in lora 2025-07-14 07:34:01 -06:00
Jaret Burkett 755f0e207c Fix issue with wan i2v scaling. Adjust aggressive loader to be compatible with updated diffusers. 2025-07-12 16:56:27 -06:00
Jaret Burkett 8d9c47316a Work on mean flow. Minor bug fixes. Omnigen improvements 2025-06-26 13:46:20 -06:00
Jaret Burkett 03bc431279 Fixed an issue training lumina 2 2025-06-24 10:29:47 -06:00
Jaret Burkett ba1274d99e Added a guidance burning loss. Modified DFE to work with new model. Bug fixes 2025-06-23 08:38:27 -06:00
Jaret Burkett 8602470952 Updated diffusion feature extractor 2025-06-19 15:36:10 -06:00
Jaret Burkett 1c2b7298dd More work on mean flow loss. Moved it to an adapter. Still not functioning properly though. 2025-06-16 07:17:35 -06:00
Jaret Burkett c0314ba325 Fixed some issues with training mean flow algo. Still testing WIP 2025-06-16 07:14:59 -06:00
Jaret Burkett fc83eb7691 WIP on mean flow loss. Still a WIP. 2025-06-12 08:00:51 -06:00
Jaret Burkett 97e101522c Increase ema feedback amount. Normalize the dfe 4 image embeds 2025-06-10 08:01:13 -06:00
Jaret Burkett eefa93f16e Various code to support experiments. 2025-06-09 11:19:21 -06:00
Jaret Burkett b6d25fcd10 Improvements to vae trainer. Adjust denoise prediction of DFE v3 2025-05-30 12:06:47 -06:00
Jaret Burkett 4f896c0d8a Fixed issue where sampling fails if doing a full finetune for some models 2025-05-17 19:37:55 +00:00
Jaret Burkett d9700bdb99 Added initial support for f-lite model 2025-05-01 11:15:18 -06:00
Jaret Burkett 2b4c525489 Reworked automagic optimizer and did more testing. Starting to really like it. Working well. 2025-04-28 08:01:10 -06:00
Jaret Burkett 88b3fbae37 Various experiments and minor bug fixes for edge cases 2025-04-25 13:44:38 -06:00
Jaret Burkett 12e3095d8a Fixed issue with saving base model version 2025-04-19 14:34:01 -06:00
Jaret Burkett 77001ee77f Update model tag on loras 2025-04-19 10:41:27 -06:00
Jaret Burkett d455e76c4f Cleanup 2025-04-18 11:44:49 -06:00
Jaret Burkett bfe29e2151 Removed all submodules. Submodule free now, yay. 2025-04-18 10:39:15 -06:00
Jaret Burkett 5f312cd46b Remove ip adapter submodule 2025-04-18 09:59:42 -06:00
Jaret Burkett f80cf99f40 Hidream is training, but has a memory leak 2025-04-13 23:28:18 +00:00
Jaret Burkett ca3ce0f34c Make it easier to designate lora blocks for new models. Improve i2v adapter speed. Fix issue with i2v adapter where cached torch tensor was wrong range. 2025-04-13 13:49:13 -06:00
Jaret Burkett 6fb44db6a0 Finished up first frame for i2v adapter 2025-04-12 17:13:04 -06:00
Jaret Burkett 4a43589666 Use a shuffled embedding as unconditional for i2v adapter 2025-04-11 10:44:43 -06:00
Jaret Burkett 059155174a Added differential mask dilation for flex2. Handle video for the i2v adapter 2025-04-10 11:50:01 -06:00
Jaret Burkett 615b0d0e94 Added initial support for training i2v adapter WIP 2025-04-09 08:06:29 -06:00
Jaret Burkett a8680c75eb Added initial support for finetuning wan i2v WIP 2025-04-07 20:34:38 -06:00
Jaret Burkett 6c8b5ab606 Added some more useful error handling and logging 2025-04-07 08:01:37 -06:00
Jaret Burkett 5ea19b6292 Small bug fixes 2025-03-30 20:09:40 -06:00
Jaret Burkett c083a0e5ea Allow DFE to not have a VAE 2025-03-30 09:23:01 -06:00
Jaret Burkett 860d892214 Pixel shuffle adapter. Some bug fixes thrown in 2025-03-29 21:15:01 -06:00
Jaret Burkett 5365200da1 Added ability to add models to finetune as plugins. Also added flux2 new arch via that method. 2025-03-27 16:07:00 -06:00
Jaret Burkett 45be82d5d6 Handle inpainting training for control_lora adapter 2025-03-24 13:17:47 -06:00
Jaret Burkett f10937e6da Handle multi control inputs for control lora training 2025-03-23 07:37:08 -06:00
Jaret Burkett 1ad58c5816 Changed control lora to only have new weights and leave other input weights alone for more flexibility when using multiple ones together. 2025-03-22 10:24:52 -06:00
Jaret Burkett f5aa4232fa Added ability to quantize with torchao 2025-03-20 16:28:54 -06:00
Jaret Burkett b829983b16 Added ability to load video datasets and train with them 2025-03-19 09:54:26 -06:00
Jaret Burkett 604e76d34d Fix issue with full finetuning wan 2025-03-17 09:17:40 -06:00
Jaret Burkett 3812957bc9 Added ability to train control loras. Other important bug fixes thrown in 2025-03-14 18:03:00 -06:00
Jaret Burkett 391329dbdc Fix issue with device placement on te 2025-03-13 20:48:12 -06:00
Jaret Burkett 31e057d9a3 Fixed issue with device placement in some scenarios when doing low vram on wan 2025-03-13 10:30:27 -06:00
Jaret Burkett e6739f7eb2 Convert wan lora weights on save to be something comfy can handle 2025-03-08 12:55:11 -07:00
Jaret Burkett 7e37918fbc Double tap module casting as it doesn't seem to happen every time. 2025-03-07 22:15:24 -07:00
Jaret Burkett 4d88f8f218 Fixed cuda error when not all tensors have been moved to the correct device. 2025-03-07 22:04:35 -07:00
Jaret Burkett 25341c4613 Got wan 14b training to work on 24GB card. 2025-03-07 17:04:10 -07:00
Jaret Burkett 391cf80fea Added training for Wan2.1. Not finalized, wait. 2025-03-07 13:53:44 -07:00
Jaret Burkett 763128ea42 Note about cogview 2025-03-05 14:46:11 -07:00
Jaret Burkett 4fe33f51c1 Fix issue with picking layers for quantization, adjust layers for better quantization of cogview4 2025-03-05 13:44:40 -07:00
Jaret Burkett aa44828c0c WIP more work on cogview4 2025-03-05 09:43:00 -07:00