tabby/crates/llama-cpp-bindings

Latest commit: 89a63dbf33 by Meng Zhang, 2023-10-30 06:27:09 +00:00
fix: when send failed, treat the request as stopped (#673)
Name                  Last commit message                                             Last commit date
include               feat: support continuous batching in llama.cpp backend (#659)  2023-10-28 23:37:05 -07:00
llama.cpp@f858db8db3  chore: clear cache when there's no active requests              2023-10-29 16:30:30 -07:00
src                   fix: when send failed, treat the request as stopped (#673)     2023-10-30 06:27:09 +00:00
Cargo.toml            feat: switch cuda backend to llama.cpp (#656)                   2023-10-27 13:41:22 -07:00
build.rs              fix: fix docker build                                           2023-10-27 21:25:45 -07:00