Commit Graph

149 Commits

Author SHA1 Message Date
Jaret Burkett
22cd40d7b9 Improvements for full tuning of flux. Added debugging launch config for vscode 2024-10-29 04:54:08 -06:00
Jaret Burkett
3400882a80 Added preliminary support for SD3.5-large lora training 2024-10-22 12:21:36 -06:00
Jaret Burkett
9f94c7b61e Added experimental param multiplier to the ema module 2024-10-22 09:25:52 -06:00
Jaret Burkett
ab22674980 Allow for a default caption file in the folder. Minor bug fixes. 2024-10-10 07:31:33 -06:00
Jaret Burkett
04424fe2d6 Added config setting to set the timestep type 2024-09-24 06:53:59 -06:00
Jaret Burkett
2776221497 Added option to cache empty prompt or trigger and unload text encoders while training 2024-09-21 20:54:09 -06:00
apolinário
bc693488eb fix diffusers codebase (#183) 2024-09-21 11:50:29 -06:00
Plat
79b4e04b80 Feat: Wandb logging (#95)
* wandb logging

* fix: start logging before train loop

* chore: add wandb dir to gitignore

* fix: wrap wandb functions

* fix: forget to send last samples

* chore: use valid type

* chore: use None when not type-checking

* chore: resolved complicated logic

* fix: follow log_every

---------

Co-authored-by: Plat <github@p1at.dev>
Co-authored-by: Jaret Burkett <jaretburkett@gmail.com>
2024-09-19 20:01:01 -06:00
Jaret Burkett
121a760c19 Added proper grad accumulation 2024-09-03 07:24:18 -06:00
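For reference, "proper" gradient accumulation usually means scaling the loss so accumulated gradients average rather than sum, and only stepping the optimizer every N micro-batches. A minimal PyTorch sketch of the idea, assuming hypothetical `model`, `optimizer`, `dataloader`, and `compute_loss` objects (not this repo's actual trainer code):

```python
import torch

accum_steps = 4  # hypothetical: micro-batches per optimizer step

optimizer.zero_grad()
for i, batch in enumerate(dataloader):
    # Scale the loss so accumulated gradients average rather than sum.
    loss = compute_loss(model, batch) / accum_steps
    loss.backward()  # gradients accumulate in each parameter's .grad
    if (i + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```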
Jaret Burkett
d44d4eb61a Added a new experimental linear weighting technique 2024-09-02 09:22:13 -06:00
apolinário
562405923f Update README.md for push_to_hub (#143)
Add diffusers examples and clarify how to use the model locally
2024-08-30 16:34:28 -06:00
apolinário
4d35a29c97 Add push_to_hub to the trainer (#109)
* add push_to_hub

* fix indentation

* indent again

* model_config

* allow samples to not exist

* repo creation fix

* don't show empty [] if widget doesn't exist

* don't submit the config and optimizer

* Unsafe to have tokens saved in the yaml file

* make sure to catch only the latest samples

* change name to slug

* formatting

* formatting

---------

Co-authored-by: multimodalart <joaopaulo.passos+multimodal@gmail.com>
2024-08-22 21:18:56 -06:00
Jaret Burkett
af108bb964 Bug fix with dataloader. Added a flag to completely disable sampling 2024-08-12 09:19:40 -06:00
Jaret Burkett
a6aa4b2c7d Added ability to set timesteps to linear for the flow matching schedule 2024-08-11 13:06:08 -06:00
Jaret Burkett
e69a520616 Reworked timestep distribution on flowmatch sampler when training. 2024-08-08 06:01:45 -06:00
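For context, flow-matching trainers often bias timestep sampling away from plain uniform. A hedged sketch of one common choice, logit-normal sampling; this illustrates the general idea, not necessarily the distribution this commit implements:

```python
import torch

def sample_flowmatch_timesteps(batch_size: int, mean: float = 0.0, std: float = 1.0):
    # Draw in logit space, then squash to (0, 1): mass concentrates near
    # the middle timesteps, where the velocity target is hardest to learn.
    u = torch.randn(batch_size) * std + mean
    return torch.sigmoid(u)
```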
Jaret Burkett
acafe9984f Adjustments to loading of flux. Added feedback to ema 2024-08-07 13:17:26 -06:00
Jaret Burkett
c2424087d6 8-bit training working on flux 2024-08-06 11:53:27 -06:00
Jaret Burkett
edb7e827ee Adjusted flow matching so the target noise multiplier works properly with it. 2024-08-05 11:40:05 -06:00
Jaret Burkett
f321de7bdb Setup to retrain guidance embedding for flux. Use default timestep distribution for flux 2024-08-04 10:37:23 -06:00
Jaret Burkett
9beea1c268 Flux training should work now... maybe 2024-08-03 09:17:34 -06:00
Jaret Burkett
87ba867fdc Added flux training. Still a WIP. Won't train right without rectified flow working right 2024-08-02 15:00:30 -06:00
Jaret Burkett
03613c523f Bugfixes and cleanup 2024-08-01 11:45:12 -06:00
Jaret Burkett
47744373f2 Change img multiplier math 2024-07-30 11:33:41 -06:00
Jaret Burkett
0bc4d555c7 A lot of pixart sigma training tweaks 2024-07-28 11:23:18 -06:00
Jaret Burkett
80aa2dbb80 New img2img image generation. Various tweaks and fixes 2024-07-24 04:13:41 -06:00
Jaret Burkett
11e426fdf1 Various features and fixes. Too much brain fog to do a proper description 2024-07-18 07:34:14 -06:00
Jaret Burkett
58dffd43a8 Added caching for image sizes so we don't recompute them every time. 2024-07-15 19:07:41 -06:00
Jaret Burkett
e4558dff4b Partial implementation for training auraflow. 2024-07-12 12:11:38 -06:00
Jaret Burkett
c008405480 Added after model load hook 2024-07-09 15:34:48 -06:00
Jaret Burkett
acb06d6ff3 Bug fixes 2024-07-03 10:56:34 -06:00
Jaret Burkett
603ceca3ca Added ema 2024-06-28 10:03:26 -06:00
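EMA here is the usual exponential moving average of model weights. A minimal sketch of the update, assuming a hypothetical `ema_model` shadow copy kept alongside `model` (illustrative, not the module added in this commit):

```python
import torch

@torch.no_grad()
def ema_update(model, ema_model, decay: float = 0.999):
    # Blend a small fraction of the new weights into the shadow copy
    # after each optimizer step; sampling later uses the shadow weights.
    for p, ema_p in zip(model.parameters(), ema_model.parameters()):
        ema_p.mul_(decay).add_(p, alpha=1.0 - decay)
```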
Jaret Burkett
7165f2d25a Work to improve pixart training 2024-06-23 20:46:48 +00:00
Jaret Burkett
5d47244c57 Added support for pixart sigma loras 2024-06-16 11:56:30 -06:00
Jaret Burkett
bd10d2d668 Some work on sd3 training. Not working 2024-06-13 12:19:16 -06:00
Jaret Burkett
cb5d28cba9 Added working ilora trainer 2024-06-12 09:33:45 -06:00
Jaret Burkett
3f3636b788 Bug fixes and little improvements here and there. 2024-06-08 06:24:20 -06:00
Jaret Burkett
5a45c709cd Work on ipadapters and custom adapters 2024-05-13 06:37:54 -06:00
Jaret Burkett
10e1ecf1e8 Added single value adapter training 2024-04-28 06:04:47 -06:00
Jaret Burkett
5da3613e0b Bug fixes and minor features 2024-04-25 06:14:31 -06:00
Jaret Burkett
7284aab7c0 Added specialized scaler training to ip adapters 2024-04-05 08:17:09 -06:00
Jaret Burkett
016687bda1 Adapter work. Bug fixes. Auto-adjust LR when resuming optimizer. 2024-03-17 10:21:47 -06:00
Jaret Burkett
f1cb87fe9e Fixed bug that kept learning rates the same 2024-03-06 09:23:32 -07:00
Jaret Burkett
b01e8d889a Added stochastic rounding to adafactor. ILora adjustments 2024-03-05 07:07:09 -07:00
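Stochastic rounding matters when optimizer updates are stored in bf16: tiny updates otherwise truncate to zero. A sketch of the bit-level trick, assuming finite fp32 inputs (illustrative; the actual Adafactor integration here may differ):

```python
import torch

def stochastic_round_to_bf16(x: torch.Tensor) -> torch.Tensor:
    # bf16 keeps the top 16 bits of an fp32 bit pattern. Adding uniform
    # noise below the truncation point makes rounding up happen with
    # probability equal to the discarded fraction, so repeated small
    # updates stay unbiased on average instead of always truncating.
    bits = x.float().view(torch.int32)
    noise = torch.randint_like(bits, 0, 1 << 16)
    return ((bits + noise) & -65536).view(torch.float32).to(torch.bfloat16)
```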
Jaret Burkett
337945de9a Added "this not that" guidance. Added ability to replace prompts. 2024-02-28 20:10:14 -07:00
Jaret Burkett
1bd94f0f01 Added early DoRA support, but will change shortly. Don't use right now. 2024-02-23 05:55:41 -07:00
Jaret Burkett
93b52932c1 Added training for pixart-α 2024-02-13 16:00:04 -07:00
Jaret Burkett
92b9c71d44 Many bug fixes. IP adapter bug fixes. Added noise to unconditional; it works better. Added an ilora adapter for one-shotting LoRAs 2024-01-28 08:20:03 -07:00
Jaret Burkett
f17ad8d794 Various bug fixes. Created a contextual alpha mask module to calculate the alpha mask 2024-01-18 16:34:27 -07:00
Jaret Burkett
eebd3c8212 Initial training script for photomaker training. Needs a little more work. 2024-01-15 18:46:26 -07:00
Jaret Burkett
5276975fb0 Added additional config options for custom plugins I needed 2024-01-15 08:31:09 -07:00