tabby/crates/llama-cpp-bindings/src
Meng Zhang 16f47005dd fix: llama.cpp queuing logic 2023-11-09 00:25:31 -08:00
engine.cc fix: llama.cpp requires kv cache to be N_CTX * parallelism (#714) 2023-11-06 23:02:28 -08:00
lib.rs fix: deadlock between background job and requests (#720) 2023-11-07 13:12:13 -08:00
llama.rs fix: llama.cpp queuing logic 2023-11-09 00:25:31 -08:00
utils.rs fix: when an error happens in the background inference loop, it should exit the process (#713) 2023-11-06 23:02:23 -08:00
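The kv cache fix for engine.cc above encodes a simple sizing rule: with a context window of N_CTX tokens per sequence and `parallelism` concurrent decode slots, the cache must hold N_CTX * parallelism cells in total. A minimal sketch of that arithmetic, with all names and values assumed for illustration (this is not the actual bindings API):

```rust
// Hypothetical sketch: each of `parallelism` concurrent sequences may
// use up to `n_ctx` tokens of KV cache, so the shared cache must be
// sized for n_ctx * parallelism cells in total.
fn required_kv_cache_cells(n_ctx: usize, parallelism: usize) -> usize {
    n_ctx * parallelism
}

fn main() {
    let n_ctx = 2048; // per-sequence context window (assumed value)
    let parallelism = 4; // concurrent decode slots (assumed value)
    let cells = required_kv_cache_cells(n_ctx, parallelism);
    println!("kv cache cells needed: {cells}");
}
```

Sizing the cache for a single context while serving several requests in parallel would let one sequence evict another's cached keys/values mid-decode, which is the kind of failure the fix addresses.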