tabby/crates/llama-cpp-bindings/src
Last commit: Meng Zhang e87e78b74c fix: llama.cpp requires kv cache to be N_CTX * parallelism (#714) 2023-11-06 23:02:28 -08:00
engine.cc   fix: llama.cpp requires kv cache to be N_CTX * parallelism (#714)                                    2023-11-06 23:02:28 -08:00
lib.rs      fix: when there's an error happens in background inference loop, it should exit the process (#713)   2023-11-06 23:02:23 -08:00
utils.rs    fix: when there's an error happens in background inference loop, it should exit the process (#713)   2023-11-06 23:02:23 -08:00