Model: Fix logit bias token checks

The membership check was accidentally done against the token bias
tensor, which doesn't contain token IDs. Instead, check whether the
index falls within the bounds of the id_to_piece list.
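The fixed check can be sketched in isolation as below. This is a minimal illustration of the bounds check described above, not the project's actual code: the `id_to_piece` list and `token_bias` dict here are stand-in arguments for what the container reads from its tokenizer and generation settings.

```python
def apply_logit_bias(logit_bias, id_to_piece, token_bias):
    """Apply biases only for token IDs that exist in the model's vocab.

    logit_bias:  mapping of token ID -> bias value (user-supplied)
    id_to_piece: vocab list indexed by token ID (stand-in for the tokenizer's)
    token_bias:  dict to fill with validated biases (stand-in for the tensor)
    """
    skipped = []
    for token_id, bias in logit_bias.items():
        # A token ID is valid only if it indexes into the vocab list;
        # checking membership in token_bias itself (the old bug) would
        # never match, since that structure holds biases, not IDs.
        if 0 <= token_id < len(id_to_piece):
            token_bias[token_id] = bias
        else:
            skipped.append(token_id)
    return token_bias, skipped
```

In the real container the out-of-vocab branch logs a warning and skips the token; here the skipped IDs are simply returned so the behavior is easy to inspect.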

Signed-off-by: kingbri <bdashore3@proton.me>
kingbri
2024-02-22 21:44:15 -05:00
parent 5a23b9ebc9
commit 360802762c


@@ -749,12 +749,12 @@ class ExllamaV2Container:
         )
         # Map logits to the tensor with their biases
-        for token, bias in logit_bias.items():
-            if token in gen_settings.token_bias:
-                gen_settings.token_bias[token] = bias
+        for token_id, bias in logit_bias.items():
+            if 0 <= token_id < len(self.tokenizer.id_to_piece):
+                gen_settings.token_bias[token_id] = bias
             else:
                 logger.warning(
-                    f"Logit bias: Token {token} not present "
+                    f"Logit bias: Token {token_id} not present "
                     "in the model's vocab. Skipping."
                 )