tabby/crates/llama-cpp-bindings
Latest commit: f72b87bf79 by Meng Zhang, 2023-11-07 13:12:13 -08:00
fix: deadlock between background job and requests (#720)

* fix: deadlock between background job and requests
* refactor: extract LlamaService
include                fix: use llama.cpp tokenizer (#683)                        2023-10-31 22:16:09 +00:00
llama.cpp@75fb6f2ba0   fix: support cpu-only run in llama.cpp cuda build          2023-11-06 23:02:31 -08:00
src                    fix: deadlock between background job and requests (#720)   2023-11-07 13:12:13 -08:00
Cargo.toml             Release 0.5.3                                              2023-11-07 00:57:48 -08:00
build.rs               fix: fix docker build                                      2023-10-27 21:25:45 -07:00