6 Commits

Author SHA1 Message Date
DenOfEquity
cc378589a4 (T5) pad chunks to length of largest chunk (#1990)
When the prompt is chunked using the BREAK keyword, chunks are padded to a minimum size of 256 tokens, but chunks can be longer. torch.stack then fails if the chunks are not all the same size, so find the largest chunk and pad all chunks to match.
#1988 (that report doesn't quite identify the real issue; prompts longer than 255 tokens work fine)
2024-10-07 11:37:06 +01:00
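The fix described above can be sketched as follows. This is a minimal illustration, not the actual Forge code: the helper name `pad_and_stack`, the padding value, and the shapes are assumptions; the point is only that every chunk is padded to the length of the largest chunk (at least the 256-token minimum) before torch.stack is called.

```python
import torch

def pad_and_stack(chunks, pad_value=0.0, min_len=256):
    # Hypothetical helper: pad each chunk along dim 0 to the length of the
    # largest chunk (never below min_len), so torch.stack cannot fail on
    # mismatched sizes.
    target = max(min_len, max(c.shape[0] for c in chunks))
    padded = [
        torch.cat([c, c.new_full((target - c.shape[0], *c.shape[1:]), pad_value)])
        if c.shape[0] < target else c
        for c in chunks
    ]
    return torch.stack(padded)

# Two BREAK-separated chunks of different lengths now stack cleanly.
chunks = [torch.zeros(256, 4), torch.zeros(300, 4)]
out = pad_and_stack(chunks)
```

Without the padding step, `torch.stack(chunks)` would raise a RuntimeError here, since stack requires every tensor to have identical shape.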
lllyasviel
cfa5242a75 forge 2.0.0
see also discussions
2024-08-10 19:24:19 -07:00
layerdiffusion
a05a06b337 make results more consistent with A1111
2024-08-08 01:53:03 -07:00
layerdiffusion
78c65708ea fix t5
2024-08-07 21:55:00 -07:00
layerdiffusion
463cff0d89 fix t5
2024-08-07 21:10:23 -07:00
lllyasviel
14a759b5ca revise kernel
2024-08-07 13:28:12 -07:00