tabby/crates/llama-cpp-bindings

Latest commit: 64e0abb8cc by Meng Zhang, "fix(llama.cpp): wrongly index for n_seq in warmup" (2023-11-04 17:53:22 -07:00)
include                refactor: use llama.cpp tokenizer (#683)               2023-10-31 22:16:09 +00:00
llama.cpp@f858db8db3   chore: clear cache when there's no active requests     2023-10-29 16:30:30 -07:00
src                    fix(llama.cpp): wrongly index for n_seq in warmup      2023-11-04 17:53:22 -07:00
Cargo.toml             Release 0.6.0-dev                                      2023-11-03 18:04:12 -07:00
build.rs               fix: fix docker build                                  2023-10-27 21:25:45 -07:00