tabby/crates/llama-cpp-bindings
Erfan Safari 138b7459c5
feat: add LLAMA_CPP_N_THREADS env (#742)
* feat: add LLAMA_CPP_N_THREADS and LLAMA_CPP_N_THREADS_BATCH envs

* apply format

* improve: use LLAMA_CPP_N_THREADS for both n_threads and n_threads_batch

* Update crates/llama-cpp-bindings/src/engine.cc

---------

Co-authored-by: Meng Zhang <meng@tabbyml.com>
2023-11-09 19:54:23 +00:00
include feat: add --parallelism to control throughput and vram usage (#727) 2023-11-08 18:31:22 +00:00
llama.cpp@efbd009d2f feat: sync llama.cpp to latest 2023-11-08 16:06:09 -08:00
src feat: add LLAMA_CPP_N_THREADS env (#742) 2023-11-09 19:54:23 +00:00
Cargo.toml fix: when an error happens in the background inference loop, it should exit the process (#713) 2023-11-06 20:41:49 +00:00
build.rs