Commit Graph

57 Commits

All commits shown are by Jaret Burkett.

SHA1        Date                        Message
55b8b0e23e  2025-10-07 13:39:44 -06:00  Fix issue where ARA was not working when using memory manager
af6fdaaaf9  2025-09-09 07:36:25 -06:00  Add ability to train a full rank LoRA (experimental)
adc31ec77d  2025-06-03 20:08:35 -06:00  Small updates and bug fixes for various things
bfe29e2151  2025-04-18 10:39:15 -06:00  Removed all submodules. Submodule-free now, yay.
1e0bff653c  2025-04-13 21:15:07 -06:00  Fix new bug I accidentally introduced with lora
ca3ce0f34c  2025-04-13 13:49:13 -06:00  Make it easier to designate lora blocks for new models. Improve i2v adapter speed. Fix issue with the i2v adapter where the cached torch tensor was in the wrong range.
2b901cca39  2025-04-05 12:39:45 -06:00  Small tweaks, bug fixes, and future-proofing
bbfd6ef0fe  2025-03-19 10:25:12 -06:00  Fixed bug that prevented using the schnell training adapter
e6739f7eb2  2025-03-08 12:55:11 -07:00  Convert wan lora weights on save to something comfy can handle
391cf80fea  2025-03-07 13:53:44 -07:00  Added training for Wan2.1. Not finalized; wait.
6f6fb90812  2025-03-04 18:43:52 -07:00  Added cogview4. Loss still needs work.
1f3f45a48d  2025-03-03 08:22:15 -07:00  Bugfixes
b16819f8e7  2025-03-02 06:57:50 -07:00  Added LoKr support
9f6030620f  2025-02-20 12:47:01 -07:00  Dataset uploads working
9a7266275d  2025-02-08 14:52:39 -07:00  Work on lumina2
d138f07365  2025-02-08 10:59:53 -07:00  Initial lumina3 support
3400882a80  2024-10-22 12:21:36 -06:00  Added preliminary support for SD3.5-large lora training
3e71a99df0  2024-08-31 07:44:13 -06:00  Check "contains only" against the clean name for a lora, not the adjusted one
7fed4ea761  2024-08-14 10:14:13 -06:00  Fixed huge flux training bug. Added ability to use an assistant lora
f7cf2f866f  2024-08-13 14:24:03 -06:00  Make 100% sure lora alpha matches for flux
c2424087d6  2024-08-06 11:53:27 -06:00  8-bit training working on flux
187663ab55  2024-08-05 14:34:37 -06:00  Use peft format for flux loras so they are compatible with diffusers. Allow loading an assistant lora
87ba867fdc  2024-08-02 15:00:30 -06:00  Added flux training. Still a WIP. Won't train right without rectified flow working right
0bc4d555c7  2024-07-28 11:23:18 -06:00  A lot of pixart sigma training tweaks
e4558dff4b  2024-07-12 12:11:38 -06:00  Partial implementation for training auraflow
cab8a1c7b8  2024-07-06 13:00:21 -06:00  WIP to add the caption_proj weight to the pixart sigma TE adapter
acb06d6ff3  2024-07-03 10:56:34 -06:00  Bug fixes
3072d20f17  2024-06-29 14:54:50 -06:00  Add ability to include conv_in and conv_out in a full train when doing a lora
5d47244c57  2024-06-16 11:56:30 -06:00  Added support for pixart sigma loras
37cebd9458  2024-06-14 09:31:01 -06:00  WIP ilora
bd10d2d668  2024-06-13 12:19:16 -06:00  Some work on sd3 training. Not working
cb5d28cba9  2024-06-12 09:33:45 -06:00  Added working ilora trainer
1bd94f0f01  2024-02-23 05:55:41 -07:00  Added early DoRA support, but it will change shortly. Don't use right now.
93b52932c1  2024-02-13 16:00:04 -07:00  Added training for pixart-a
92b9c71d44  2024-01-28 08:20:03 -07:00  Many bug fixes. IP adapter bug fixes. Added noise to the unconditional; it works better. Added an ilora adapter for one-shotting LoRAs
5276975fb0  2024-01-15 08:31:09 -07:00  Added additional config options for custom plugins I needed
39870411d8  2023-12-15 06:02:10 -07:00  More guidance work. Improved LoRA module resolver for the unet. Added vega mappings and LoRA training for it. Various other bugfixes and changes
6f3e0d5af2  2023-10-28 08:21:59 -06:00  Improved lorm extraction and training
002279cec3  2023-10-24 16:02:07 -06:00  Allow short and long caption combinations, like from the new captioning system. Merge the network into the model before inference and re-extract when done; doubles inference speed on locon models. Allow splitting a batch into individual components and running them through alone. Basically gradient accumulation with single batch size.
cac8754399  2023-10-06 08:11:56 -06:00  Allow training loras on only one text encoder for sdxl
569d7464d5  2023-09-14 08:31:54 -06:00  Implemented the device placement preset system in more places. Vastly improved speed when setting the network multiplier and activating the network. Fixed timing issues on the progress bar
be804c9cf5  2023-09-09 12:02:07 -06:00  Save embeddings as their trigger to match auto and comfy style loading. Also, FINALLY found why gradients were wonky and fixed it. The root problem was dropping out of network state before the backward pass.
408c50ead1  2023-09-09 11:27:42 -06:00  Actually got gradient checkpointing working, again, again, maybe
436bf0c6a3  2023-09-06 18:50:32 -06:00  Added experimental concept replacer, replicate converter, bucket maker, and other goodies
f84500159c  2023-09-04 14:27:37 -06:00  Fixed issue with lora layer check
64a5441832  2023-09-04 14:05:10 -06:00  Fully tested and now supporting locon on sdxl, if you have the RAM
a4c3507a62  2023-09-04 08:48:07 -06:00  Added LoCon from LyCORIS
fa8fc32c0a  2023-09-04 00:22:34 -06:00  Corrected key saving and loading to better match kohya
bd758ff203  2023-08-29 05:45:49 -06:00  Cleanup and small bug fixes
71da78c8af  2023-08-28 12:42:57 -06:00  Improved normalization for a network with varying batch network weights
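Commit 002279cec3 describes running each sample of a batch through alone, "basically gradient accumulation with single batch size". A minimal sketch of why that is equivalent, using a hypothetical one-parameter linear model with MSE loss (the model and function names here are illustrative, not taken from the repo):

```python
# Sketch: averaging per-sample gradients (micro-batch size 1) equals the
# gradient of the mean loss over the full batch, for model y = w * x.

def grad_single(w, x, y):
    # d/dw (w*x - y)^2 = 2 * (w*x - y) * x
    return 2 * (w * x - y) * x

def grad_full_batch(w, xs, ys):
    # Gradient of the mean loss computed over the whole batch at once.
    return sum(grad_single(w, x, y) for x, y in zip(xs, ys)) / len(xs)

def grad_accumulated(w, xs, ys):
    # Run each sample through alone and accumulate, then average:
    # "gradient accumulation with single batch size".
    acc = 0.0
    for x, y in zip(xs, ys):
        acc += grad_single(w, x, y)
    return acc / len(xs)
```

Because the loss is a mean over samples, the accumulated-and-averaged per-sample gradients match the full-batch gradient exactly; the trade is lower peak memory for more forward/backward passes.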