Commit Graph

195 Commits

Author SHA1 Message Date
Jaret Burkett
bb6db3d635 Added support for caching text embeddings. This is just initial support and will probably fail for some models. Still needs to be optimized 2025-08-07 10:27:55 -06:00
Jaret Burkett
f500b9f240 Add ability to do more advanced sample prompt objects to prepare for a UI rework on control images and other things. 2025-07-17 07:13:35 -06:00
Jaret Burkett
ba1274d99e Added a guidance burning loss. Modified DFE to work with new model. Bug fixes 2025-06-23 08:38:27 -06:00
Jaret Burkett
989ebfaa11 Added a basic torch profiler that can be used in config during development to find some obvious issues. 2025-06-17 13:03:39 -06:00
Jaret Burkett
1cc663a664 Performance optimizations for pre processing the batch 2025-06-17 07:37:41 -06:00
Jaret Burkett
1c2b7298dd More work on mean flow loss. Moved it to an adapter. Still not functioning properly though. 2025-06-16 07:17:35 -06:00
Jaret Burkett
eefa93f16e Various code to support experiments. 2025-06-09 11:19:21 -06:00
Jaret Burkett
7045a01375 Fixed issue saving optimizer in some instances. 2025-05-21 02:27:55 -06:00
Jaret Burkett
e5181d23cd Added some experimental training techniques. Ignore for now. Still in testing. 2025-05-21 02:19:54 -06:00
Jaret Burkett
2b4c525489 Reworked automagic optimizer and did more testing. Starting to really like it. Working well. 2025-04-28 08:01:10 -06:00
Jaret Burkett
88b3fbae37 Various experiments and minor bug fixes for edge cases 2025-04-25 13:44:38 -06:00
Jaret Burkett
77001ee77f Update model tag on loras 2025-04-19 10:41:27 -06:00
Jaret Burkett
d455e76c4f Cleanup 2025-04-18 11:44:49 -06:00
Jaret Burkett
615b0d0e94 Added initial support for training i2v adapter WIP 2025-04-09 08:06:29 -06:00
Jaret Burkett
a8680c75eb Added initial support for finetuning wan i2v WIP 2025-04-07 20:34:38 -06:00
Jaret Burkett
6c8b5ab606 Added some more useful error handling and logging 2025-04-07 08:01:37 -06:00
Jaret Burkett
860d892214 Pixel shuffle adapter. Some bug fixes thrown in 2025-03-29 21:15:01 -06:00
Jaret Burkett
45be82d5d6 Handle inpainting training for control_lora adapter 2025-03-24 13:17:47 -06:00
Jaret Burkett
3812957bc9 Added ability to train control loras. Other important bug fixes thrown in 2025-03-14 18:03:00 -06:00
Jaret Burkett
e6739f7eb2 Convert wan lora weights on save to be something comfy can handle 2025-03-08 12:55:11 -07:00
Jaret Burkett
391cf80fea Added training for Wan2.1. Not finalized, wait. 2025-03-07 13:53:44 -07:00
Jaret Burkett
6f6fb90812 Added cogview4. Loss still needs work. 2025-03-04 18:43:52 -07:00
Jaret Burkett
acc79956aa WIP create new class to add new models more easily 2025-03-01 13:49:02 -07:00
Jaret Burkett
3e49337a58 Set step to the last saved step when exiting 2025-02-23 13:21:22 -07:00
Jaret Burkett
60f848a877 Send more data when loading the model to the ui 2025-02-23 12:49:54 -07:00
Jaret Burkett
ad87f72384 Start, stop, monitor jobs from ui working. 2025-02-21 09:49:28 -07:00
Jaret Burkett
adcf884c0f Built out the ui trainer plugin with db communication 2025-02-21 05:53:35 -07:00
Jaret Burkett
8450aca10e Fixed missed merge conflict and locked diffusers version 2025-02-12 09:40:02 -07:00
Jaret Burkett
0b8a32def7 merged in lumina2 branch 2025-02-12 09:33:03 -07:00
Jaret Burkett
787bb37e76 Small fixes for DFE, polar guidance, and other things 2025-02-12 09:27:44 -07:00
Jaret Burkett
10aa7e9d5e Fixed some breaking changes with diffusers gradient checkpointing. 2025-02-10 09:35:31 -07:00
Jaret Burkett
d138f07365 Initial lumina3 support 2025-02-08 10:59:53 -07:00
Jaret Burkett
c6d8eedb94 Added ability to use consistent noise for each image in a dataset by hashing the path and using that as a seed. 2025-02-08 07:13:48 -07:00
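The consistent-noise idea in the commit above (hash the image path, use the hash as a seed) can be sketched as follows. This is an illustrative assumption, not the repository's actual code; `consistent_noise` and the SHA-256-based seeding are hypothetical names, and a real trainer would draw PyTorch tensors rather than Python floats:

```python
import hashlib
import random

def path_seed(path: str) -> int:
    # Derive a stable 32-bit seed from the file path. The same path
    # always yields the same seed across runs and machines.
    return int.from_bytes(hashlib.sha256(path.encode("utf-8")).digest()[:4], "big")

def consistent_noise(path: str, n: int) -> list[float]:
    # Seed a private RNG from the path so every epoch pairs this image
    # with the same Gaussian noise sample.
    rng = random.Random(path_seed(path))
    return [rng.gauss(0.0, 1.0) for _ in range(n)]
```

Because the seed depends only on the path, the same image is always trained against the same noise, while different images still get independent noise.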
Jaret Burkett
216ab164ce Experimental features and bug fixes 2025-02-04 13:36:34 -07:00
Jaret Burkett
e6180d1e1d Bug fixes 2025-01-31 13:23:01 -07:00
Jaret Burkett
15a57bc89f Add new version of DFE. Kitchen sink 2025-01-31 11:42:27 -07:00
Jaret Burkett
34a1c6947a Added flux_shift as timestep type 2025-01-27 07:35:00 -07:00
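A flux-style shift remaps a uniform timestep in [0, 1] toward higher noise levels. A minimal sketch of the commonly used formulation is below; the exact behavior of `flux_shift` in this repo, and the default `shift` value, are assumptions:

```python
def shift_timestep(t: float, shift: float = 3.0) -> float:
    # Shifted remapping used in Flux/SD3-style flow-matching schedules:
    # t' = s*t / (1 + (s-1)*t). Fixes 0 and 1, pushes interior
    # timesteps toward the high-noise end for shift > 1.
    return shift * t / (1.0 + (shift - 1.0) * t)
```

With `shift=3.0`, the midpoint `t=0.5` maps to `0.75`, so more training steps land in the noisier regime.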
Jaret Burkett
5e663746b8 Working multi gpu training. Still need a lot of tweaks and testing. 2025-01-25 16:46:20 -07:00
Jaret Burkett
89dd041b97 Added ability to pair samples with a closer noise with optimal_noise_pairing_samples 2025-01-21 18:30:10 -07:00
Jaret Burkett
4723f23c0d Added ability to split up flux across gpus (experimental). Changed the way timestep scheduling works to prep for more specific schedules. 2024-12-31 07:06:55 -07:00
Jaret Burkett
8ef07a9c36 Added training for an experimental decorator embedding. Allow for turning off guidance embedding on flux (for unreleased model). Various bug fixes and modifications 2024-12-15 08:59:27 -07:00
Jaret Burkett
f213996aa5 Fixed saving and displaying for automagic 2024-11-29 08:00:22 -07:00
Jaret Burkett
67c2e44edb Added support for training flux redux adapters 2024-11-21 20:01:52 -07:00
Jaret Burkett
96d418bb95 Added support for full finetuning flux with randomized param activation. Examples coming soon 2024-11-21 13:05:32 -07:00
Jaret Burkett
58f9d01c2b Added adafactor implementation that handles stochastic rounding of update and accumulation 2024-10-30 05:25:57 -06:00
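Stochastic rounding, mentioned in the commit above, rounds a value up or down with probability proportional to the fractional remainder, so the rounding error is zero in expectation; this is what keeps low-precision (e.g. bf16) optimizer updates from systematically losing small gradients. A minimal scalar sketch, not the repo's adafactor code (which operates on tensors):

```python
import math
import random

def stochastic_round(x: float, step: float, rng: random.Random) -> float:
    # Round x to a multiple of `step`, rounding up with probability
    # equal to the fractional remainder, so E[result] == x.
    q = x / step
    lo = math.floor(q)
    frac = q - lo
    return (lo + (1 if rng.random() < frac else 0)) * step
```

Averaged over many updates, 0.3 rounded to integer steps contributes 0.3, whereas round-to-nearest would always contribute 0 and silently drop the update.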
Jaret Burkett
4747716867 Fixed issue with adapters not providing gradients with new grad activator 2024-10-29 14:22:10 -06:00
Jaret Burkett
22cd40d7b9 Improvements for full tuning flux. Added debugging launch config for vscode 2024-10-29 04:54:08 -06:00
Jaret Burkett
3400882a80 Added preliminary support for SD3.5-large lora training 2024-10-22 12:21:36 -06:00
Jaret Burkett
9f94c7b61e Added experimental param multiplier to the ema module 2024-10-22 09:25:52 -06:00
Jaret Burkett
ab22674980 Allow for a default caption file in the folder. Minor bug fixes. 2024-10-10 07:31:33 -06:00