tabby/crates/llama-cpp-bindings
Meng Zhang 7330d75de6 chore: clear cache when there's no active requests 2023-10-29 16:30:30 -07:00

include               feat: support continuous batching in llama.cpp backend (#659)   2023-10-28 23:37:05 -07:00
llama.cpp@f858db8db3  chore: clear cache when there's no active requests              2023-10-29 16:30:30 -07:00
src                   chore: clear cache when there's no active requests              2023-10-29 16:30:30 -07:00
Cargo.toml            feat: switch cuda backend to llama.cpp (#656)                   2023-10-27 13:41:22 -07:00
build.rs              fix: fix docker build                                           2023-10-27 21:25:45 -07:00