
#597 - Feature Request: Add THUDM/GLM-4-MoE-100B-A10B support

Author ubergarm
State Open
Created 2025-07-10
Updated 2025-07-14

Description

The THUDM dev zRzRzRzRzRzRzR appears to be adding support for a new, not-yet-released THUDM/GLM-4-MoE-100B-A10B model architecture to vLLM here

It is not confirmed, but this demo might be hosting the model currently: https://chat.z.ai/

There is some more speculation on r/LocalLLaMA here as well.

If it looks promising, I might try to add support for this nicely sized MoE when it is released.


💬 Conversation

👤 arch-btw commented on 2025-07-14 at 23:51:59:

Yes, I look forward to this release myself!

Just a heads up though, the name appears to be a placeholder:

(image: screenshot of the placeholder name)

From here.