228 Commits

Author SHA1 Message Date
Jaret Burkett
611969ec1f Allow control image for omnigen training and sampling 2025-07-09 13:54:55 -06:00
Jaret Burkett
bbb57de6ec Speed up omnigen TE loading 2025-07-05 09:32:00 -06:00
Jaret Burkett
5906a76666 Fixed issue with flux kontext forcing generation image sizes 2025-06-29 05:38:20 -06:00
Jaret Burkett
57a81bc0db Update base model version for kontext meta 2025-06-28 14:48:36 -06:00
Jaret Burkett
01a3c8a9b1 Fix device issue 2025-06-26 19:14:25 -06:00
Jaret Burkett
4f91cb7148 Fix issue with gradient checkpointing and flux kontext 2025-06-26 19:03:12 -06:00
Jaret Burkett
446b0b6989 Remove revision for kontext 2025-06-26 16:46:58 -06:00
Jaret Burkett
60ef2f1df7 Added support for FLUX.1-Kontext-dev 2025-06-26 15:24:37 -06:00
Jaret Burkett
8d9c47316a Work on mean flow. Minor bug fixes. Omnigen improvements 2025-06-26 13:46:20 -06:00
Jaret Burkett
84c6edca7e Merge branch 'main' into dev 2025-06-25 14:10:25 -06:00
Jaret Burkett
19ea8ecc38 Added support for finetuning OmniGen2. 2025-06-25 13:58:16 -06:00
Jaret Burkett
18513ec866 Merged in from main 2025-06-24 10:56:54 -06:00
Jaret Burkett
f3eb1dff42 Add a config flag to trigger fast image size db builder. Add config flag to set unconditional prompt for guidance loss 2025-06-24 08:51:29 -06:00
Jaret Burkett
ba1274d99e Added a guidance burning loss. Modified DFE to work with new model. Bug fixes 2025-06-23 08:38:27 -06:00
Jaret Burkett
8602470952 Updated diffusion feature extractor 2025-06-19 15:36:10 -06:00
Jaret Burkett
1cc663a664 Performance optimizations for preprocessing the batch 2025-06-17 07:37:41 -06:00
Jaret Burkett
1c2b7298dd More work on mean flow loss. Moved it to an adapter. Still not functioning properly though. 2025-06-16 07:17:35 -06:00
Jaret Burkett
c0314ba325 Fixed some issues with training mean flow algo. Still testing WIP 2025-06-16 07:14:59 -06:00
Jaret Burkett
cbf04b8d53 Fixed some issues with training mean flow algo. Still testing WIP 2025-06-14 12:24:00 -06:00
Jaret Burkett
fc83eb7691 WIP on mean flow loss. 2025-06-12 08:00:51 -06:00
Jaret Burkett
eefa93f16e Various code to support experiments. 2025-06-09 11:19:21 -06:00
Jaret Burkett
22cdfadab6 Added new timestep weighting strategy 2025-06-04 01:16:02 -06:00
Jaret Burkett
adc31ec77d Small updates and bug fixes for various things 2025-06-03 20:08:35 -06:00
Jaret Burkett
ffaf2f154a Fix issue with the way chroma handled gradient checkpointing. 2025-05-28 08:41:47 -06:00
Jaret Burkett
79bb9be92b Fix issue with saving chroma full finetune. 2025-05-28 07:42:30 -06:00
Jaret Burkett
79499fa795 Allow fine tuning pruned versions of chroma. Allow flash attention 2 for chroma if it is installed. 2025-05-21 07:02:50 -06:00
Jaret Burkett
e5181d23cd Added some experimental training techniques. Ignore for now. Still in testing. 2025-05-21 02:19:54 -06:00
Jaret Burkett
6174ba474e Fixed issue with chroma sampling 2025-05-10 18:30:23 +00:00
Jaret Burkett
43cb5603ad Added chroma model to the ui. Added logic to easily pull latest, use local, or use a specific version of chroma. Allow custom name or path in the ui for custom models 2025-05-07 12:06:30 -06:00
Jaret Burkett
d9700bdb99 Added initial support for f-lite model 2025-05-01 11:15:18 -06:00
Jaret Burkett
5890e67a46 Various bug fixes 2025-04-29 09:30:33 -06:00
Jaret Burkett
add83df5cc Fixed issue with training hidream when batch size is larger than 1 2025-04-21 17:26:29 +00:00
Jaret Burkett
77001ee77f Update model tag on LoRAs 2025-04-19 10:41:27 -06:00
Jaret Burkett
0f99fce004 Adjust hidream lora names to work with comfy 2025-04-16 09:24:23 -06:00
Jaret Burkett
524bd2edfc Make flash attn optional. Handle larger batch sizes. 2025-04-14 14:34:46 +00:00
Jaret Burkett
3a5ea2c742 Remove some MoE stuff for finetuning. Drastically reduces VRAM usage 2025-04-14 00:57:34 +00:00
Jaret Burkett
f80cf99f40 Hidream is training, but has a memory leak 2025-04-13 23:28:18 +00:00
Jaret Burkett
594e166ca3 Initial support for hidream. Still a WIP 2025-04-13 13:50:11 -06:00
Jaret Burkett
6fb44db6a0 Finished up first frame for i2v adapter 2025-04-12 17:13:04 -06:00
Jaret Burkett
cd37ccfc2e Use gradient checkpointing on DFE models if set 2025-04-11 10:45:39 -06:00
Jaret Burkett
059155174a Added differential mask dilation for flex2. Handle video for the i2v adapter 2025-04-10 11:50:01 -06:00
Jaret Burkett
a8680c75eb Added initial support for finetuning wan i2v WIP 2025-04-07 20:34:38 -06:00
Jaret Burkett
7c21eac1b3 Added support for Lodestone Rock's Chroma model 2025-04-05 13:21:36 -06:00
Jaret Burkett
2b901cca39 Small tweaks and bug fixes and future proofing 2025-04-05 12:39:45 -06:00
Jaret Burkett
ac1ee559c5 Added blurring to mask for flex2 2025-04-02 07:55:51 -06:00
Jaret Burkett
a42c5a1de5 Adjust buckets for flex2 2025-04-02 06:47:41 -06:00
Jaret Burkett
c083a0e5ea Allow DFE to not have a VAE 2025-03-30 09:23:01 -06:00
Jaret Burkett
5365200da1 Added ability to add models to finetune as plugins. Also added the new flux2 arch via that method. 2025-03-27 16:07:00 -06:00
Jaret Burkett
ce4c5291a0 Added experimental wavelet loss 2025-03-26 18:11:23 -06:00
Jaret Burkett
45be82d5d6 Handle inpainting training for control_lora adapter 2025-03-24 13:17:47 -06:00