Commit Graph

762 Commits

Author SHA1 Message Date
anzz1
a990294c27 [main] fix infinite generation (-n == -1) (#523) 2023-03-26 16:06:10 +03:00
Harald Fernengel
85e558b4ad Exit from interactive mode if input stream is bad (#491)
Also allow exiting the interactive prompt with CTRL-D on Unix and CTRL-Z on Windows.
2023-03-26 08:25:46 +03:00
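The change above boils down to letting a failed read end the interactive loop. A minimal sketch, not the actual main.cpp code: std::getline leaves the stream in a failed state once stdin hits EOF (CTRL-D on Unix, CTRL-Z then Enter on Windows) or goes bad, so looping on its result exits cleanly instead of spinning.

```cpp
#include <iostream>
#include <string>

int main() {
    std::string line;
    // getline's result converts to false once the stream is at EOF or bad,
    // so CTRL-D / CTRL-Z ends the loop instead of leaving it stuck.
    while (std::getline(std::cin, line)) {
        std::cout << "you said: " << line << "\n";
    }
    return 0; // leave "interactive mode" on a dead input stream
}
```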
anzz1
f8eb92869e (Windows) Set console to UTF-8 on init (#420)
Sets the console codepage to 65001 (CP_UTF8) on start for both input and output; this should fix problems with UTF-8 characters.
2023-03-25 22:29:22 +02:00
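For context, the Win32 calls the commit message refers to are SetConsoleCP and SetConsoleOutputCP with CP_UTF8 (codepage 65001). Below is a minimal sketch of such an init step; the function name and where it would be called from are assumptions, not the commit's actual code.

```cpp
#if defined(_WIN32)
#include <windows.h>

// Sketch: switch both the input and output codepages to UTF-8 (65001)
// before any prompt text is read or printed.
static void console_init_utf8() {
    SetConsoleCP(CP_UTF8);       // codepage used for reads from the console
    SetConsoleOutputCP(CP_UTF8); // codepage used for writes to the console
}
#endif
```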
Georgi Gerganov
2e01c018d2 Fix colors enabling on WIN32 2023-03-25 21:53:39 +02:00
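On Windows 10+ consoles, ANSI color escapes only take effect once virtual-terminal processing is enabled. The sketch below shows that common approach; the exact fix in this commit may differ.

```cpp
#if defined(_WIN32)
#include <windows.h>

// Sketch: opt the console into interpreting ANSI escape sequences so the
// same color codes used on Unix terminals also work on Windows 10+.
static bool console_enable_ansi_colors() {
    HANDLE h = GetStdHandle(STD_OUTPUT_HANDLE);
    if (h == INVALID_HANDLE_VALUE) return false;
    DWORD mode = 0;
    if (!GetConsoleMode(h, &mode)) return false;
    return SetConsoleMode(h, mode | ENABLE_VIRTUAL_TERMINAL_PROCESSING) != 0;
}
#endif
```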
Georgi Gerganov
9fe0e95688 If n_predict == -1, generate forever 2023-03-25 21:51:41 +02:00
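Together with the "-n == -1" fix at the top of this list, the idea is that -1 means "no token limit". A sketch with illustrative names (not the actual llama.cpp variables):

```cpp
#include <cstdio>

// -1 means "no limit", so the loop condition must special-case it rather
// than comparing the counter against it directly.
static bool keep_generating(int n_predict, int n_generated) {
    return n_predict == -1 || n_generated < n_predict;
}

int main() {
    int n_predict   = -1; // e.g. taken from the -n command-line flag
    int n_generated = 0;
    while (keep_generating(n_predict, n_generated)) {
        // ... sample and emit the next token here ...
        if (++n_generated >= 4) break; // stand-in for an end-of-text token
    }
    std::printf("generated %d tokens\n", n_generated);
    return 0;
}
```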
Georgi Gerganov
310d5d09a3 Infinite generation via context swapping (#71) 2023-03-25 21:36:22 +02:00
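Context swapping here means that once the context window fills up, generation keeps going by reusing part of it rather than stopping. Below is a sketch of one such scheme under assumed names (tokens, n_past, n_ctx, n_keep), not the commit's actual code: keep the first n_keep tokens (typically the prompt) plus the most recent half of the rest, then re-evaluate those and continue.

```cpp
#include <utility>
#include <vector>

// Sketch of a context-swap step. Assumes `tokens` holds exactly the n_past
// tokens seen so far. When the window is full, retain the first n_keep
// tokens and the most recent half of what follows; the caller then
// re-evaluates the kept tokens and keeps decoding.
static void swap_context(std::vector<int> &tokens, int &n_past,
                         int n_ctx, int n_keep) {
    if (n_past < n_ctx) return; // still room in the context, nothing to do

    const int n_left = n_past - n_keep;
    std::vector<int> kept(tokens.begin(), tokens.begin() + n_keep);
    kept.insert(kept.end(), tokens.end() - n_left / 2, tokens.end());

    tokens = std::move(kept);
    n_past = (int) tokens.size(); // these tokens will be re-evaluated
}
```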
Georgi Gerganov
3468a153ba Cleanup STL headers + fix embedding examples + minor stuff 2023-03-25 20:51:14 +02:00
Georgi Gerganov
9d678e17dc Move chat scripts into "./examples" 2023-03-25 20:37:09 +02:00
Georgi Gerganov
84db7c0b8f Overhaul the examples structure
- main -> examples
- utils -> examples (renamed to "common")
- quantize -> examples
- separate tools for "perplexity" and "embedding"

Hope I didn't break something!
2023-03-25 20:26:40 +02:00
Georgi Gerganov
ba186f7f64 Immediately start processing the prompt before user input has been provided (#476) 2023-03-24 23:17:58 +02:00
Mathieu Nayrolles
410097f4c8 fix typo in chatLLaMa (#368)
The prompt contains a typo where 'alound' is used instead of 'aloud'.
2023-03-21 22:52:27 +02:00
Jean-Christophe Hoelt
6b596e4215 Add chatLLaMa script (#198)
* Add chatLLaMa script

* Fix shellcheck errors and do some cleanup

* Move chatLLaMa script to `examples` directory

* Reduce chatLLaMa context size to 2048

Ref d7def1a752

* Set n_predict to 2048 in examples/chatLLaMa
2023-03-21 18:23:15 +02:00