Mirror of https://github.com/theroyallab/tabbyAPI.git, synced 2026-03-14 15:57:27 +00:00
Model: Fix max_seq_len fallbacks
The rope alpha calculation raised an error when max_seq_len wasn't provided, because the model's max sequence length was not stored as the target for the alpha calculation.

Signed-off-by: kingbri <8082010+kingbri1@users.noreply.github.com>
@@ -237,7 +237,8 @@ class ExllamaV2Container(BaseModelContainer):
         base_seq_len = self.config.max_seq_len

         # Set the target seq len if present
-        target_seq_len = kwargs.get("max_seq_len")
+        # Fallback to base_seq_len if not provided
+        target_seq_len = unwrap(kwargs.get("max_seq_len"), base_seq_len)

         # Set the rope scale
         self.config.scale_pos_emb = unwrap(
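The fix swaps a bare `kwargs.get("max_seq_len")` (which returns `None` when the key is absent) for `unwrap`, which falls back to the model's own `base_seq_len`. A minimal sketch of such a helper; this reimplementation is an assumption for illustration, not tabbyAPI's actual utility code:

```python
# Hypothetical reimplementation of the unwrap helper used in the diff:
# return the value unless it is None, in which case use the default.
def unwrap(value, default=None):
    return value if value is not None else default


# Mirrors the fixed line: when the caller omits max_seq_len from kwargs,
# target_seq_len falls back to the model's configured max sequence length.
base_seq_len = 4096  # stand-in for self.config.max_seq_len
kwargs = {}          # no max_seq_len provided by the caller

target_seq_len = unwrap(kwargs.get("max_seq_len"), base_seq_len)
print(target_seq_len)  # falls back to 4096
```

Checking against `None` (rather than truthiness) matters here: a caller could legitimately pass a falsy-looking value, and only a true absence should trigger the fallback.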