mirror of
https://github.com/turboderp-org/exllamav2.git
synced 2026-04-28 18:21:36 +00:00
Fixed minor typo in convert.md doc (#463)
changed '64 GB or RAM' to '64 GB of RAM'
@@ -149,7 +149,7 @@ catastrophically.
### Hardware requirements
-Roughly speaking, you'll need about 64 GB or RAM and 24 GB of VRAM to convert a 70B model, while 7B seems to require
+Roughly speaking, you'll need about 64 GB of RAM and 24 GB of VRAM to convert a 70B model, while 7B seems to require
about 16 GB of RAM and about 8 GB of VRAM.
The deciding factor for the memory requirement is the *width* of the model rather than the depth, so 120B models that