Commit Graph

270 Commits

Author SHA1 Message Date
Jaret Burkett
a767b82b60 Fixed issue with new logger when OOMing 2025-12-25 16:57:34 +00:00
Jaret Burkett
87edca1b2b Added initial support to initiate lora training from an existing lora 2025-12-22 12:49:15 -07:00
Jaret Burkett
ba00eea7d9 Add loss graph to the ui 2025-12-18 10:08:59 -07:00
Jaret Burkett
0d8a33dc16 Offload ARA with the layer if doing layer offloading. Add support to offload the LoRA. Still needs optimizer support 2025-10-21 06:03:27 -06:00
Jaret Burkett
76ce757e0c Added initial support for layer offloading with Wan 2.2 14B models. 2025-10-20 14:54:30 -06:00
Jaret Burkett
1bc6dee127 Change auto_memory to be layer_offloading and allow you to set the amount to unload 2025-10-10 13:12:32 -06:00
Jaret Burkett
4e5707854f Initial support for RamTorch. Still a WIP 2025-10-05 13:03:26 -06:00
Jaret Burkett
3086a58e5b git status 2025-10-01 14:12:17 -06:00
Jaret Burkett
b07b88c46b Allow trigger when caching text embeddings since it is now passed to the dataset 2025-09-30 16:58:35 -06:00
Jaret Burkett
3b1f7b0948 Allow user to set the attention backend. Add method to recover from the occasional OOM if it is a rare event. Still exit if it OOMs 3 times in a row. 2025-09-27 08:56:15 -06:00
Jaret Burkett
454be0958a Initial support for qwen image edit plus 2025-09-24 11:39:10 -06:00
Jaret Burkett
390e21bec6 Integrate dataset level trigger words and allow them to be cached. Default to global trigger if it is set. 2025-09-18 03:29:18 -06:00
Jaret Burkett
fc5b41666a Switch order to save first, then sample. 2025-08-27 11:07:03 -06:00
Jaret Burkett
f48d21caee Upgrade a LoRA rank if the new one is larger so users can increase the rank on an existing training job and continue training at a higher rank. 2025-08-24 13:40:25 -06:00
Jaret Burkett
b3e666daf4 Fix issue with Wan 2.2 14B where timesteps were generated outside the current boundary. 2025-08-16 21:16:48 -06:00
Jaret Burkett
8ea2cf00f6 Added training to the ui. Still testing, but everything seems to be working. 2025-08-16 05:51:37 -06:00
Jaret Burkett
3413fa537f Wan 2.2 14B training is working, still needs tons of testing and some bug fixes 2025-08-14 13:03:27 -06:00
Jaret Burkett
69ee99b6e1 Fix issue with base model version 2025-08-12 09:26:48 -06:00
Jaret Burkett
77b10d884d Add support for training with an accuracy recovery adapter with qwen image 2025-08-12 08:21:36 -06:00
Jaret Burkett
bb6db3d635 Added support for caching text embeddings. This is just initial support and will probably fail for some models. Still needs to be optimized 2025-08-07 10:27:55 -06:00
Jaret Burkett
1755e58dd9 Update generation script to handle latest models. 2025-08-05 08:55:16 -06:00
Jaret Burkett
3f518d9951 Add sharpening before losses with a split loss on vae training 2025-07-27 15:11:56 -06:00
Jaret Burkett
0d89c44624 Bug fixes on vae trainer. Allow targeting params for vae training. 2025-07-26 09:20:22 -06:00
Jaret Burkett
c5eb763342 Improvements to VAE trainer. Allow CLIP loss. 2025-07-24 06:50:56 -06:00
Jaret Burkett
e25d2feddf Use scale shift in vae latent space for vae trainer 2025-07-17 08:14:07 -06:00
Jaret Burkett
f500b9f240 Add ability to do more advanced sample prompt objects to prepare for a UI rework on control images and other things. 2025-07-17 07:13:35 -06:00
Jaret Burkett
3916e67455 Scale target vae latent before targeting it 2025-07-17 07:12:21 -06:00
Jaret Burkett
e5ed450dc7 Allow finetuning tiny autoencoder in vae trainer 2025-07-16 07:13:30 -06:00
Jaret Burkett
2e84b3d5b1 Update VAE trainer to handle fixed latent target. Also minor bug fixes and improvements 2025-07-12 16:55:15 -06:00
Jaret Burkett
ba1274d99e Added a guidance burning loss. Modified DFE to work with new model. Bug fixes 2025-06-23 08:38:27 -06:00
Jaret Burkett
989ebfaa11 Added a basic torch profiler that can be used in config during development to find some obvious issues. 2025-06-17 13:03:39 -06:00
Jaret Burkett
1cc663a664 Performance optimizations for pre processing the batch 2025-06-17 07:37:41 -06:00
Jaret Burkett
1c2b7298dd More work on mean flow loss. Moved it to an adapter. Still not functioning properly though. 2025-06-16 07:17:35 -06:00
Jaret Burkett
eefa93f16e Various code to support experiments. 2025-06-09 11:19:21 -06:00
Jaret Burkett
adc31ec77d Small updates and bug fixes for various things 2025-06-03 20:08:35 -06:00
Jaret Burkett
b6d25fcd10 Improvements to vae trainer. Adjust denoise prediction of DFE v3 2025-05-30 12:06:47 -06:00
Jaret Burkett
34f4c14cd6 Work on vae trainer 2025-05-28 07:42:48 -06:00
Jaret Burkett
7045a01375 Fixed issue saving optimizer in some instances. 2025-05-21 02:27:55 -06:00
Jaret Burkett
e5181d23cd Added some experimental training techniques. Ignore for now. Still in testing. 2025-05-21 02:19:54 -06:00
Jaret Burkett
43cb5603ad Added chroma model to the ui. Added logic to easily pull latest, use local, or use a specific version of chroma. Allow custom name or path in the ui for custom models 2025-05-07 12:06:30 -06:00
Jaret Burkett
2b4c525489 Reworked automagic optimizer and did more testing. Starting to really like it. Working well. 2025-04-28 08:01:10 -06:00
Jaret Burkett
88b3fbae37 Various experiments and minor bug fixes for edge cases 2025-04-25 13:44:38 -06:00
Jaret Burkett
77001ee77f Update model tag on loras 2025-04-19 10:41:27 -06:00
Jaret Burkett
d455e76c4f Cleanup 2025-04-18 11:44:49 -06:00
Jaret Burkett
bd2de5b74e Remove leco submodule 2025-04-18 10:08:09 -06:00
Jaret Burkett
615b0d0e94 Added initial support for training i2v adapter WIP 2025-04-09 08:06:29 -06:00
Jaret Burkett
a8680c75eb Added initial support for finetuning wan i2v WIP 2025-04-07 20:34:38 -06:00
Jaret Burkett
6c8b5ab606 Added some more useful error handling and logging 2025-04-07 08:01:37 -06:00
Jaret Burkett
860d892214 Pixel shuffle adapter. Some bug fixes thrown in 2025-03-29 21:15:01 -06:00
Jaret Burkett
45be82d5d6 Handle inpainting training for control_lora adapter 2025-03-24 13:17:47 -06:00