tabby/crates/llama-cpp-bindings
Latest commit: fix: when there's an error happens in background inference loop, it should exit the process (#713)
Author: Meng Zhang, commit 9344c32b31, 2023-11-06 20:41:49 +00:00
Path                     Last commit                                                                                          Date
include                  refactor: use llama.cpp tokenizer (#683)                                                             2023-10-31 22:16:09 +00:00
llama.cpp @ f858db8db3   chore: clear cache when there's no active requests                                                   2023-10-29 16:30:30 -07:00
src                      fix: when there's an error happens in background inference loop, it should exit the process (#713)   2023-11-06 20:41:49 +00:00
Cargo.toml               fix: when there's an error happens in background inference loop, it should exit the process (#713)   2023-11-06 20:41:49 +00:00
build.rs                 fix: fix docker build                                                                                2023-10-27 21:25:45 -07:00