tabby/crates/llama-cpp-bindings

Latest commit 1ad0d39903 by Meng Zhang, 2023-11-07 13:11:28 -08:00:
fix: deadlock between background job and requests (#720)

* fix: deadlock between background job and requests
* refactor: extract LlamaService
include                fix: use llama.cpp tokenizer (#683)                                                            2023-10-31 22:16:09 +00:00
llama.cpp@75fb6f2ba0   fix: support cpu only run in llama.cpp cuda build                                              2023-11-06 22:59:24 -08:00
src                    fix: deadlock between background job and requests (#720)                                       2023-11-07 13:11:28 -08:00
Cargo.toml             fix: when there's an error happens in background inference loop, it should exit the process (#713)  2023-11-06 20:41:49 +00:00
build.rs