tabby/crates/llama-cpp-bindings

Latest commit: 8c669dee8e by Meng Zhang — "fix: llama.cpp queuing logic (#741)", 2023-11-09 08:29:54 +00:00
Name                    Last commit message                                                                                  Date
include                 feat: add --parallelism to control throughput and vram usage (#727)                                  2023-11-08 18:31:22 +00:00
llama.cpp@efbd009d2f    feat: sync llama.cpp to latest                                                                       2023-11-08 16:06:09 -08:00
src                     fix: llama.cpp queuing logic (#741)                                                                  2023-11-09 08:29:54 +00:00
Cargo.toml              fix: when there's an error happens in background inference loop, it should exit the process (#713)   2023-11-06 20:41:49 +00:00
build.rs                fix: fix docker build                                                                                2023-10-27 21:25:45 -07:00