Commit Graph

138 Commits

Author SHA1 Message Date
Plat
79b4e04b80 Feat: Wandb logging (#95)
* wandb logging

* fix: start logging before train loop

* chore: add wandb dir to gitignore

* fix: wrap wandb functions

* fix: forgot to send last samples

* chore: use valid type

* chore: use None when not type-checking

* chore: resolved complicated logic

* fix: follow log_every

---------

Co-authored-by: Plat <github@p1at.dev>
Co-authored-by: Jaret Burkett <jaretburkett@gmail.com>
2024-09-19 20:01:01 -06:00
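
The PR above hides wandb behind a thin wrapper so training runs work with or without it, starts logging before the train loop, and respects log_every. A minimal sketch of that pattern (the class and argument names are illustrative, not this repo's actual API):

```python
# Minimal sketch of the wrapping pattern this PR describes; class and
# argument names here are illustrative, not this repo's actual API.
try:
    import wandb
except ImportError:
    wandb = None  # training still runs without wandb installed


class WandbClient:
    def __init__(self, enabled: bool, project: str, log_every: int = 10):
        # start the run before the train loop so early metrics are captured
        self.run = wandb.init(project=project) if (enabled and wandb) else None
        self.log_every = log_every

    def log(self, metrics: dict, step: int) -> None:
        # follow log_every so the run is not flooded with per-step metrics
        if self.run is not None and step % self.log_every == 0:
            self.run.log(metrics, step=step)

    def finish(self) -> None:
        # flush buffered data so the last samples are not lost on exit
        if self.run is not None:
            self.run.finish()
```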
Jaret Burkett
fc34a69bec Ignore guidance embed when full tuning flux. Adjust block scaler to decay to 1.0. Add MLP resampler for reducing vision adapter tokens 2024-09-09 16:24:46 -06:00
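
The MLP resampler mentioned above shrinks the number of vision tokens fed to the adapter. A hedged sketch of what such a module can look like; the shapes, defaults, and names are assumptions, not the repo's actual code:

```python
import torch
import torch.nn as nn

class MLPResampler(nn.Module):
    # Illustrative token resampler: a learned mixing layer shrinks the token
    # axis, then a per-token MLP refines the features. Defaults assume a
    # CLIP-style vision output of 257 tokens; all of this is an assumption.
    def __init__(self, in_tokens: int = 257, out_tokens: int = 64, dim: int = 1024):
        super().__init__()
        self.token_mix = nn.Linear(in_tokens, out_tokens)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_tokens, dim) -> (batch, out_tokens, dim)
        x = self.token_mix(x.transpose(1, 2)).transpose(1, 2)
        return self.mlp(x)
```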
Jaret Burkett
e5fadddd45 Added ability to do prompt attn masking for flux 2024-09-02 17:29:36 -06:00
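
A minimal sketch of prompt attention masking as described above: keep the tokenizer's attention_mask and zero the embeddings at padding positions so only real tokens condition the model. The T5 checkpoint name and max_length are illustrative assumptions:

```python
# Hedged sketch; the T5 checkpoint name and max_length are illustrative.
import torch
from transformers import T5EncoderModel, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-xxl")
text_encoder = T5EncoderModel.from_pretrained("google/t5-v1_1-xxl")

tokens = tokenizer("a photo of a cat", padding="max_length", max_length=512,
                   truncation=True, return_tensors="pt")
with torch.no_grad():
    embeds = text_encoder(tokens.input_ids,
                          attention_mask=tokens.attention_mask)[0]
# zero out padded positions so only real tokens condition the model
embeds = embeds * tokens.attention_mask.unsqueeze(-1).to(embeds.dtype)
```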
Jaret Burkett
60232def91 Made preliminary arch for flux ip adapter training 2024-08-28 08:55:39 -06:00
Jaret Burkett
338c77d677 Fixed breaking change with diffusers. Allow flowmatch on normal stable diffusion models. 2024-08-22 14:36:22 -06:00
Jaret Burkett
c45887192a Unload interim weights when doing multi lora fuse 2024-08-18 09:35:10 -06:00
Jaret Burkett
13a965a26c Fixed bad key naming on lora fuse I just pushed 2024-08-18 09:33:31 -06:00
Jaret Burkett
f944eeaa4d Fuse flux schnell assistant adapter in pieces when doing lowvram to drastically speed it up from minutes to seconds. 2024-08-18 09:09:11 -06:00
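
A sketch of the piecewise fuse idea above, assuming each adapted linear exposes plain lora_A/lora_B factors (a simplification of peft's real per-adapter layout): merging one module at a time keeps only a single expanded delta in memory.

```python
import torch

@torch.no_grad()
def fuse_lora_in_pieces(model: torch.nn.Module, scale: float = 1.0) -> None:
    # Merge LoRA deltas into the base weights one module at a time so only
    # a single layer's low-rank factors are ever expanded at once.
    for module in model.modules():
        if hasattr(module, "lora_A") and hasattr(module, "lora_B"):
            delta = module.lora_B.weight @ module.lora_A.weight  # (out, in)
            module.weight += scale * delta.to(module.weight.dtype)
            del delta  # free the expanded matrix before the next layer
```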
Jaret Burkett
81899310f8 Added support for training on flux schnell. Added example config and instructions for training on flux schnell 2024-08-17 06:58:39 -06:00
Jaret Burkett
f9179540d2 Flush after sampling 2024-08-16 17:29:42 -06:00
Jaret Burkett
452e0e286d For lora-assisted training, merge the adapter in before quantizing, then sample with schnell at -1 weight. Almost doubles training speed with a lora adapter. 2024-08-16 17:28:44 -06:00
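
A hedged sketch of the ordering the commit above describes, using diffusers calls that exist today (the adapter path is a placeholder): fusing happens while the weights are still full precision, since merging into quantized tensors is what took minutes.

```python
# Sketch only: the adapter path is a placeholder, the diffusers calls are real.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev",
                                    torch_dtype=torch.bfloat16)
pipe.load_lora_weights("path/to/assistant_lora.safetensors",
                       adapter_name="assistant")
pipe.fuse_lora(lora_scale=1.0)  # merge into full-precision weights: seconds
# ... quantize the transformer and train; per the commit above, sampling
# then applies schnell at -1 weight ...
```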
Jaret Burkett
7fed4ea761 Fixed huge flux training bug. Added ability to use an assistant lora 2024-08-14 10:14:13 -06:00
Jaret Burkett
599fafe01f Allow user to have the full flux checkpoint local 2024-08-12 09:57:16 -06:00
Jaret Burkett
6490a326e5 Fixed issue for vaes without a shift 2024-08-11 10:30:55 -06:00
Jaret Burkett
ec1ea7aa0e Added support for training on primary gpu with low_vram flag. Updated example script to remove creepy horse sample at that seed 2024-08-11 09:54:30 -06:00
Jaret Burkett
b3e03295ad Reworked flux pred. Again 2024-08-08 13:06:34 -06:00
Jaret Burkett
acafe9984f Adjustments to loading of flux. Added feedback to ema 2024-08-07 13:17:26 -06:00
Jaret Burkett
c2424087d6 8 bit training working on flux 2024-08-06 11:53:27 -06:00
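
A minimal sketch of an 8-bit setup for the flux transformer, assuming optimum-quanto as the quantization backend (an assumption, not confirmed by the commit):

```python
# Hedged sketch, assuming optimum-quanto as the backend (not confirmed here):
# quantize and freeze the transformer, leaving only adapter params trainable.
import torch
from diffusers import FluxTransformer2DModel
from optimum.quanto import quantize, freeze, qfloat8

transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev", subfolder="transformer",
    torch_dtype=torch.bfloat16)
quantize(transformer, weights=qfloat8)  # 8-bit weights
freeze(transformer)                     # bake in the quantized weights
transformer.requires_grad_(False)       # gradients flow only to LoRA params
```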
Jaret Burkett
272c8608c2 Make a CFG version of flux pipeline 2024-08-05 16:35:53 -06:00
Jaret Burkett
187663ab55 Use peft format for flux loras so they are compatible with diffusers. allow loading an assistant lora 2024-08-05 14:34:37 -06:00
Jaret Burkett
edb7e827ee Adjusted flow matching so target noise multiplier works properly with it. 2024-08-05 11:40:05 -06:00
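
One consistent way to combine a noise multiplier with flow matching, as a hedged sketch rather than the repo's exact code: scale the noise once, before both the interpolated input and the velocity target are derived from it.

```python
import torch

def flowmatch_target(latents, t, noise_multiplier=1.0):
    # t: per-sample timestep in [0, 1], shaped to broadcast over latents
    noise = torch.randn_like(latents) * noise_multiplier
    noisy = (1.0 - t) * latents + t * noise  # linear path from data to noise
    target = noise - latents                 # velocity the model predicts
    return noisy, target
```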
Jaret Burkett
0ea27011d5 Bug fix 2024-08-04 11:07:19 -06:00
Jaret Burkett
f321de7bdb Setup to retrain guidance embedding for flux. Use default timestep distribution for flux 2024-08-04 10:37:23 -06:00
Jaret Burkett
9beea1c268 Flux training should work now... maybe 2024-08-03 09:17:34 -06:00
Jaret Burkett
369aa143bc Only train a few blocks on flux (for now) 2024-08-03 07:02:27 -06:00
Jaret Burkett
87ba867fdc Added flux training. Still a WIP. Won't train correctly until rectified flow is working right 2024-08-02 15:00:30 -06:00
Jaret Burkett
e81e19fd0f Added target_norm_std which is a game changer 2024-07-28 16:08:33 -06:00
Jaret Burkett
0bc4d555c7 A lot of pixart sigma training tweaks 2024-07-28 11:23:18 -06:00
Jaret Burkett
11e426fdf1 Various features and fixes. Too much brain fog to do a proper description 2024-07-18 07:34:14 -06:00
Jaret Burkett
e4558dff4b Partial implementation for training auraflow. 2024-07-12 12:11:38 -06:00
Jaret Burkett
045e4a6e15 Save entire pixart model again 2024-07-07 07:56:48 -06:00
Jaret Burkett
cab8a1c7b8 WIP to add the caption_proj weight to pixart sigma TE adapter 2024-07-06 13:00:21 -06:00
Jaret Burkett
657fd09f25 Added more control over Sigma sizes 2024-06-26 08:57:53 -06:00
Jaret Burkett
7165f2d25a Work to improve pixart training 2024-06-23 20:46:48 +00:00
Jaret Burkett
5d47244c57 Added support for pixart sigma loras 2024-06-16 11:56:30 -06:00
Jaret Burkett
696f73c30d Removed variant 2024-06-14 17:09:47 -06:00
Jaret Burkett
37cebd9458 WIP Ilora 2024-06-14 09:31:01 -06:00
Jaret Burkett
bd10d2d668 Some work on sd3 training. Not working 2024-06-13 12:19:16 -06:00
Jaret Burkett
3f3636b788 Bug fixes and little improvements here and there. 2024-06-08 06:24:20 -06:00
Jaret Burkett
68b7e159bc Bug Fixes 2024-05-17 08:41:20 -06:00
Jaret Burkett
10e1ecf1e8 Added single value adapter training 2024-04-28 06:04:47 -06:00
Jaret Burkett
b96913d73c Improvements to dataloader 2024-04-27 09:28:28 -06:00
Jaret Burkett
5da3613e0b Bug fixes and minor features 2024-04-25 06:14:31 -06:00
Jaret Burkett
5a70b7f38d Added pixart sigma support, but it won't work until I address breaking changes with the lora code in diffusers so it can be upgraded. 2024-04-20 10:46:56 -06:00
Jaret Burkett
7284aab7c0 Added specialized scaler training to ip adapters 2024-04-05 08:17:09 -06:00
Jaret Burkett
427847ac4c Small tweaks and fixes for specialized ip adapter training 2024-03-26 11:35:26 -06:00
Jaret Burkett
016687bda1 Adapter work. Bug fixes. Auto adjust LR when resuming optimizer. 2024-03-17 10:21:47 -06:00
Jaret Burkett
72de68d8aa WIP on clip vision encoder 2024-03-13 07:24:08 -06:00
Jaret Burkett
337945de9a Added "this not that" guidance. Added ability to replace prompts. 2024-02-28 20:10:14 -07:00
Jaret Burkett
561914d8e6 Removed old code for fixing multistep sampler that is no longer needed 2024-02-25 11:53:35 -07:00