Files
stable-diffusion-webui-forge/backend
DenOfEquity cc378589a4 (T5) pad chunks to length of largest chunk (#1990)
When the prompt is chunked using the BREAK keyword, each chunk is padded to a minimum of 256 tokens, but individual chunks can be longer than that. torch.stack then fails unless all chunks are the same size, so find the largest chunk and pad every chunk to match its length.
#1988 (doesn't quite identify the real issue; prompts longer than 255 tokens work fine)
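The fix described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: `stack_chunks` is a hypothetical helper, and it assumes each chunk is a `[tokens, dim]` embedding tensor.

```python
import torch
import torch.nn.functional as F

def stack_chunks(chunks, min_len=256, pad_value=0.0):
    # Hypothetical helper illustrating the fix: torch.stack requires all
    # tensors to share a shape, so pad every chunk to the length of the
    # largest one (never below the 256-token minimum) before stacking.
    target = max(min_len, max(c.shape[0] for c in chunks))
    padded = [
        # F.pad's last-dim pair is (0, 0): leave the embedding dim alone;
        # the second pair pads rows (tokens) at the end up to `target`.
        F.pad(c, (0, 0, 0, target - c.shape[0]), value=pad_value)
        for c in chunks
    ]
    return torch.stack(padded)
```

With chunks of 256 and 300 tokens, the shorter one is padded to 300 rows and the result has shape `[2, 300, dim]`, which is exactly the case the unpatched `torch.stack` call choked on.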
2024-10-07 11:37:06 +01:00

WIP Backend for Forge