tabby/crates/llama-cpp-bindings/src

Latest commit: 1ad0d39903 by Meng Zhang, 2023-11-07 13:11:28 -08:00
fix: deadlock between background job and requests (#720)

* fix: deadlock between background job and requests
* refactor: extract LlamaService
engine.cc   fix: llama.cpp requires kv cache to be N_CTX * parallelism (#714)                                2023-11-07 06:16:36 +00:00
lib.rs      fix: deadlock between background job and requests (#720)                                         2023-11-07 13:11:28 -08:00
llama.rs    fix: deadlock between background job and requests (#720)                                         2023-11-07 13:11:28 -08:00
utils.rs    fix: when there's an error happens in background inference loop, it should exit the process (#713) 2023-11-06 20:41:49 +00:00
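The commits in lib.rs and llama.rs fix a deadlock between the background inference job and incoming requests by extracting a `LlamaService`. A common Rust pattern for that kind of fix is to give a single background thread exclusive ownership of the engine and serve requests over a channel, so request handlers never contend for a lock with the inference loop. The sketch below is a hypothetical illustration of that pattern only; `Engine`, `Request`, and the `LlamaService` methods are invented names, not the crate's actual API.

```rust
use std::sync::mpsc;
use std::thread;

// Stand-in for the llama.cpp engine wrapped by the bindings (hypothetical).
struct Engine;

impl Engine {
    fn step(&mut self, prompt: &str) -> String {
        format!("completion for: {prompt}")
    }
}

// A request carries its prompt plus a one-shot reply channel.
struct Request {
    prompt: String,
    reply: mpsc::Sender<String>,
}

struct LlamaService {
    tx: mpsc::Sender<Request>,
}

impl LlamaService {
    fn start(mut engine: Engine) -> Self {
        let (tx, rx) = mpsc::channel::<Request>();
        thread::spawn(move || {
            // Background loop: the only code that ever touches the engine,
            // so no mutex is shared with request handlers and no lock
            // ordering between the two sides can deadlock.
            for req in rx {
                let out = engine.step(&req.prompt);
                let _ = req.reply.send(out);
            }
        });
        LlamaService { tx }
    }

    fn complete(&self, prompt: &str) -> String {
        let (reply_tx, reply_rx) = mpsc::channel();
        self.tx
            .send(Request { prompt: prompt.to_string(), reply: reply_tx })
            .expect("service thread alive");
        reply_rx.recv().expect("service replied")
    }
}

fn main() {
    let svc = LlamaService::start(Engine);
    println!("{}", svc.complete("fn main"));
}
```

The design choice is that ownership replaces locking: because the engine is moved into the service thread, the compiler guarantees requests can only reach it through the channel.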
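The engine.cc commit notes that llama.cpp requires the kv cache to be sized as N_CTX * parallelism: each concurrent sequence needs its own full context window of cache slots. A minimal arithmetic sketch of that sizing rule, with illustrative names (not the crate's code):

```rust
// Hypothetical sketch of the kv cache sizing rule from #714:
// total cache tokens = context length * number of parallel sequences.
fn kv_cache_size(n_ctx: u32, parallelism: u32) -> u32 {
    // Each parallel request may fill its entire context window,
    // so the cache must reserve n_ctx tokens per request.
    n_ctx * parallelism
}

fn main() {
    // e.g. a 2048-token context served to 4 concurrent requests
    println!("{}", kv_cache_size(2048, 4)); // prints 8192
}
```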