mirror of
https://github.com/ikawrakow/ik_llama.cpp.git
synced 2026-04-25 08:59:30 +00:00
1.3 KiB
✨ #597 - Feature Request: Add THUDM/GLM-4-MoE-100B-A10B support
| Author | ubergarm |
|---|---|
| State | ✅ Open |
| Created | 2025-07-10 |
| Updated | 2025-07-14 |
Description
The THUDM dev zRzRzRzRzRzRzR appears to be adding support for a new, as-yet-unreleased THUDM/GLM-4-MoE-100B-A10B model architecture to vLLM here
It is not confirmed, but this demo might be hosting the model currently: https://chat.z.ai/
Some more speculation on r/LocalLLaMA here as well.
If it looks promising, I might try to add support for this nicely sized MoE once it is ready.
💬 Conversation
👤 arch-btw commented on 2025-07-14 at 23:51:59:
Yes, I look forward to this release myself!
Just a heads up though, the name appears to be a placeholder:
From here.