ikawrakow / ik_llama.cpp
Mirror of https://github.com/ikawrakow/ik_llama.cpp.git, synced 2026-01-26 17:20:01 +00:00
ik_llama.cpp / tests at commit 5ac2fe8ccfdb7660f81cdc3cb9a06fb2c92dc5db

Latest commit: 5ac2fe8ccf  tests : fix test-grad0  (Georgi Gerganov, 2023-07-05 20:20:25 +03:00)
CMakeLists.txt
    ggml : implement backward pass for llama + small training-llama-from-scratch example (#1360)
    2023-05-13 15:56:40 +03:00

test-double-float.c
    all : be more strict about converting float to double (#458)
    2023-03-28 19:48:20 +03:00

test-grad0.c
    tests : fix test-grad0
    2023-07-05 20:20:25 +03:00

test-opt.c
    ggml : implement backward pass for llama + small training-llama-from-scratch example (#1360)
    2023-05-13 15:56:40 +03:00

test-quantize-fns.cpp
    ggml : generalize quantize_fns for simpler FP16 handling (#1237)
    2023-07-05 19:13:06 +03:00

test-quantize-perf.cpp
    ggml : generalize quantize_fns for simpler FP16 handling (#1237)
    2023-07-05 19:13:06 +03:00

test-sampling.cpp
    llama : fix top-p sampling to match the canonical definition (#1953)
    2023-06-24 13:15:01 +03:00

test-tokenizer-0.cpp
    llama : make model stateless and context stateful (llama_state) (#1797)
    2023-06-24 11:47:58 +03:00