Commit Graph

39 Commits

Author SHA1 Message Date
Jaret Burkett
7fed4ea761 Fixed huge flux training bug. Added ability to use an assistant lora 2024-08-14 10:14:13 -06:00
Jaret Burkett
f7cf2f866f Make 100% sure lora alpha matches for flux 2024-08-13 14:24:03 -06:00
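The commit above is about keeping the LoRA alpha consistent for flux. As a hedged illustration only (the function name is hypothetical, not this repo's code): LoRA deltas are applied scaled by alpha/rank, so saving with one alpha and loading with another silently rescales the learned weights, which is why an exact match matters.

```python
def lora_scale(alpha: float, rank: int) -> float:
    """LoRA updates are applied as W + (alpha / rank) * B @ A,
    so a mismatched alpha rescales the learned delta without
    raising any error. Hypothetical helper for illustration."""
    return alpha / rank

# A rank-16 LoRA trained with alpha=16 has scale 1.0 ...
train_scale = lora_scale(16.0, 16)   # 1.0
# ... but loaded with alpha=32 it is silently doubled.
load_scale = lora_scale(32.0, 16)    # 2.0
```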
Jaret Burkett
c2424087d6 8 bit training working on flux 2024-08-06 11:53:27 -06:00
Jaret Burkett
187663ab55 Use peft format for flux loras so they are compatible with diffusers. allow loading an assistant lora 2024-08-05 14:34:37 -06:00
Jaret Burkett
87ba867fdc Added flux training. Still a WIP. Won't train right without rectified flow working right 2024-08-02 15:00:30 -06:00
Jaret Burkett
0bc4d555c7 A lot of pixart sigma training tweaks 2024-07-28 11:23:18 -06:00
Jaret Burkett
e4558dff4b Partial implementation for training auraflow. 2024-07-12 12:11:38 -06:00
Jaret Burkett
cab8a1c7b8 WIP to add the caption_proj weight to pixart sigma TE adapter 2024-07-06 13:00:21 -06:00
Jaret Burkett
acb06d6ff3 Bug fixes 2024-07-03 10:56:34 -06:00
Jaret Burkett
3072d20f17 Add ability to include conv_in and conv_out to full train when doing a lora 2024-06-29 14:54:50 -06:00
Jaret Burkett
5d47244c57 Added support for pixart sigma loras 2024-06-16 11:56:30 -06:00
Jaret Burkett
37cebd9458 WIP Ilora 2024-06-14 09:31:01 -06:00
Jaret Burkett
bd10d2d668 Some work on sd3 training. Not working 2024-06-13 12:19:16 -06:00
Jaret Burkett
cb5d28cba9 Added working ilora trainer 2024-06-12 09:33:45 -06:00
Jaret Burkett
1bd94f0f01 Added early DoRA support, but will change shortly. Don't use right now. 2024-02-23 05:55:41 -07:00
Jaret Burkett
93b52932c1 Added training for pixart-a 2024-02-13 16:00:04 -07:00
Jaret Burkett
92b9c71d44 Many bug fixes. IP adapter bug fixes. Added noise to unconditional; it works better. Added an ilora adapter for one-shotting LoRAs 2024-01-28 08:20:03 -07:00
Jaret Burkett
5276975fb0 Added additional config options for custom plugins I needed 2024-01-15 08:31:09 -07:00
Jaret Burkett
39870411d8 More guidance work. Improved LoRA module resolver for unet. Added vega mappings and LoRA training for it. Various other bug fixes and changes 2023-12-15 06:02:10 -07:00
Jaret Burkett
6f3e0d5af2 Improved lorm extraction and training 2023-10-28 08:21:59 -06:00
Jaret Burkett
002279cec3 Allow short and long caption combinations like from the new captioning system. Merge the network into the model before inference and re-extract when done; doubles inference speed on locon models. Allow splitting a batch into individual components and running them through alone. Basically gradient accumulation with single batch size. 2023-10-24 16:02:07 -06:00
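The batch splitting described in the commit above amounts to gradient accumulation with a batch size of one: summing per-example gradients, each scaled by 1/n, reproduces the full-batch gradient. A minimal pure-Python sketch with a scalar least-squares loss (all names hypothetical, not this repo's code):

```python
def grad_full_batch(w, xs, ys):
    """d/dw of mean((w*x - y)^2) computed over the whole batch at once."""
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

def grad_accumulated(w, xs, ys):
    """Same gradient, but each example is run through alone and
    accumulated -- gradient accumulation with batch size 1."""
    n = len(xs)
    acc = 0.0
    for x, y in zip(xs, ys):
        acc += 2 * (w * x - y) * x / n   # scale each micro-step by 1/n
    return acc

xs, ys, w = [1.0, 2.0, 3.0], [2.0, 3.0, 5.0], 0.5
assert abs(grad_full_batch(w, xs, ys) - grad_accumulated(w, xs, ys)) < 1e-12
```

The payoff is memory: only one example's activations live at a time, at the cost of more forward/backward passes.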
Jaret Burkett
cac8754399 Allow for training loras on only one text encoder for sdxl 2023-10-06 08:11:56 -06:00
Jaret Burkett
569d7464d5 implemented device placement preset system more places. Vastly improved speed on setting network multiplier and activating network. Fixed timing issues on progress bar 2023-09-14 08:31:54 -06:00
Jaret Burkett
be804c9cf5 Save embeddings as their trigger to match auto and comfy style loading. Also, FINALLY found why gradients were wonky and fixed it. The root problem is dropping out of network state before backward pass. 2023-09-09 12:02:07 -06:00
Jaret Burkett
408c50ead1 actually got gradient checkpointing working, again, again, maybe 2023-09-09 11:27:42 -06:00
Jaret Burkett
436bf0c6a3 Added experimental concept replacer, replicate converter, bucket maker, and other goodies 2023-09-06 18:50:32 -06:00
Jaret Burkett
f84500159c Fixed issue with lora layer check 2023-09-04 14:27:37 -06:00
Jaret Burkett
64a5441832 Fully tested and now supporting locon on sdxl. If you have the ram 2023-09-04 14:05:10 -06:00
Jaret Burkett
a4c3507a62 Added LoCON from LyCORIS 2023-09-04 08:48:07 -06:00
Jaret Burkett
fa8fc32c0a Corrected key saving and loading to better match kohya 2023-09-04 00:22:34 -06:00
Jaret Burkett
bd758ff203 Cleanup and small bug fixes 2023-08-29 05:45:49 -06:00
Jaret Burkett
71da78c8af improved normalization for a network with varying batch network weights 2023-08-28 12:42:57 -06:00
Jaret Burkett
c446f768ea Huge memory optimizations, many bug fixes 2023-08-27 17:48:02 -06:00
Jaret Burkett
9b164a8688 Fixed issue with bucket dataloader cropping in too much. Added normalization capabilities to LoRA modules. Testing effects, but should prevent them from burning and also make them more compatible with stacking many LoRAs 2023-08-27 09:40:01 -06:00
Jaret Burkett
379992d89e Various bug fixes and improvements 2023-08-12 05:59:50 -06:00
Jaret Burkett
8c90fa86c6 Complete rework of how slider training works and optimized it to hell. Can run entire algorithm in 1 batch now with less than a quarter of the VRAM it used to take 2023-08-05 18:46:08 -06:00
Jaret Burkett
1a25b275c8 Did some work on SD rescaler. Need to run a long test on it eventually. 2023-08-02 07:59:27 -06:00
Jaret Burkett
cb70c03273 SDXL should be working, but I broke something where it is not converging. 2023-07-25 13:50:59 -06:00
Jaret Burkett
ddcd9069e1 Base for loopback lora training setup, still working on proper sliders 2023-07-21 18:26:02 -06:00