Commit Graph

30 Commits

Author SHA1 Message Date
Jaret Burkett
7fed4ea761 Fixed huge flux training bug. Added ability to use an assistant lora 2024-08-14 10:14:13 -06:00
Jaret Burkett
c2424087d6 8-bit training working on flux 2024-08-06 11:53:27 -06:00
Jaret Burkett
187663ab55 Use peft format for flux loras so they are compatible with diffusers. Allow loading an assistant lora 2024-08-05 14:34:37 -06:00
Jaret Burkett
0bc4d555c7 A lot of pixart sigma training tweaks 2024-07-28 11:23:18 -06:00
Jaret Burkett
cab8a1c7b8 WIP to add the caption_proj weight to pixart sigma TE adapter 2024-07-06 13:00:21 -06:00
Jaret Burkett
3072d20f17 Add ability to include conv_in and conv_out to full train when doing a lora 2024-06-29 14:54:50 -06:00
Jaret Burkett
f965a1299f Fixed DoRA implementation. Still highly experimental 2024-02-24 10:26:01 -07:00
Jaret Burkett
1bd94f0f01 Added early DoRA support, but it will change shortly. Don't use it right now. 2024-02-23 05:55:41 -07:00
Jaret Burkett
e074058faa Work on additional image embedding methods. Finalized zipper resampler. It works amazingly well 2024-02-10 09:00:05 -07:00
Jaret Burkett
e18e0cb5f8 Added comparative loss when training clip encoder. Allow selecting the clip layer on ip adapter. Improvements to prior prediction 2024-02-05 07:40:03 -07:00
Jaret Burkett
39870411d8 More guidance work. Improved LoRA module resolver for unet. Added vega mappings and LoRA training for it. Various other bugfixes and changes 2023-12-15 06:02:10 -07:00
Jaret Burkett
eaa0fb6253 Tons of bug fixes and improvements to special training. Fixed slider training. 2023-12-09 16:38:10 -07:00
Jaret Burkett
d7e55b6ad4 Bug fixes, negative prompting during training, hardened error catching 2023-11-24 07:25:11 -07:00
Jaret Burkett
6f3e0d5af2 Improved lorm extraction and training 2023-10-28 08:21:59 -06:00
Jaret Burkett
002279cec3 Allow short and long caption combinations like from the new captioning system. Merge the network into the model before inference and re-extract when done. Doubles inference speed on locon models. Allow splitting a batch into individual components and running them through alone; basically gradient accumulation with a batch size of one. 2023-10-24 16:02:07 -06:00
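A minimal sketch (assumed, not the repository's actual code) of the batch splitting described in the commit above: each item in the batch is run through the model alone and gradients are accumulated before a single optimizer step, matching "gradient accumulation with a batch size of one." The function and argument names are illustrative.

```python
import torch

def train_step_single_items(model, loss_fn, optimizer, inputs, targets):
    """Run a full batch one item at a time, accumulating gradients so the
    update matches a single full-batch step at the memory cost of batch size 1."""
    optimizer.zero_grad()
    num_items = inputs.shape[0]
    for x, y in zip(inputs.split(1, dim=0), targets.split(1, dim=0)):
        loss = loss_fn(model(x), y) / num_items  # scale so summed grads equal the batch mean
        loss.backward()                          # gradients accumulate in param.grad
    optimizer.step()
```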
Jaret Burkett
63ceffae24 Massive speed increases and RAM optimizations 2023-10-10 06:07:55 -06:00
Jaret Burkett
830e87cb87 Added IP adapter training. Not functioning correctly yet 2023-09-24 02:39:43 -06:00
Jaret Burkett
0f105690cc Added some further extensibility for plugins 2023-09-19 05:41:44 -06:00
Jaret Burkett
27f343fc08 Added base setup for training t2i adapters. Currently untested; saw something else shiny I wanted to finish first. Added content_or_style to the training config. It defaults to balanced, which is standard uniform timestep sampling. If style or content is passed, it will use cubic sampling for timesteps to favor timesteps that are beneficial for training them: for style, favor later timesteps; for content, favor earlier timesteps. 2023-09-16 08:30:38 -06:00
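A minimal sketch of the content_or_style timestep sampling described in this commit, assuming a standard 1000-step diffusion schedule; the function name and exact cubic curve are illustrative, not the toolkit's exact implementation.

```python
import torch

def sample_timesteps(batch_size, num_train_timesteps=1000, content_or_style="balanced"):
    u = torch.rand(batch_size)          # uniform draws in [0, 1)
    if content_or_style == "style":
        u = 1.0 - u ** 3                # cubic curve biased toward later (noisier) timesteps
    elif content_or_style == "content":
        u = u ** 3                      # cubic curve biased toward earlier (cleaner) timesteps
    # "balanced" keeps u uniform, i.e. standard uniform timestep sampling
    return (u * num_train_timesteps).long().clamp(0, num_train_timesteps - 1)
```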
Jaret Burkett
569d7464d5 Implemented device placement preset system in more places. Vastly improved speed when setting the network multiplier and activating the network. Fixed timing issues on progress bar 2023-09-14 08:31:54 -06:00
Jaret Burkett
ae70200d3c Bug fixes, speed improvements, compatibility adjustments with diffusers updates 2023-09-13 07:03:53 -06:00
Jaret Burkett
e8583860ad Upgraded to dev for t2i on diffusers. Minor migrations to make it work. 2023-09-11 14:46:06 -06:00
Jaret Burkett
083cefa78c Bugfixes for slider reference 2023-09-10 18:36:23 -06:00
Jaret Burkett
708b07adb7 Fixed issue with interleaving when doing cfg 2023-09-10 10:26:58 -06:00
Jaret Burkett
be804c9cf5 Save embeddings as their trigger to match auto and comfy style loading. Also, FINALLY found why gradients were wonky and fixed it. The root problem was dropping out of the network state before the backward pass. 2023-09-09 12:02:07 -06:00
Jaret Burkett
408c50ead1 actually got gradient checkpointing working, again, again, maybe 2023-09-09 11:27:42 -06:00
Jaret Burkett
b01ab5d375 FINALLY fixed gradient checkpointing issue. Big batches baby. 2023-09-08 15:21:46 -06:00
Jaret Burkett
3feb663a51 Switched to a new bucket system that matches sdxl's trained buckets. Fixed requirements. Updated embeddings to work with sdxl. Added a method to train a lora with an embedding at the trigger. Still testing, but it works amazingly well from what I can see 2023-09-07 13:06:18 -06:00
Jaret Burkett
64a5441832 Fully tested and now supporting locon on sdxl, if you have the RAM 2023-09-04 14:05:10 -06:00
Jaret Burkett
a4c3507a62 Added LoCON from LyCORIS 2023-09-04 08:48:07 -06:00