tabby/crates/llama-cpp-bindings/src

Latest commit: feat: support continuous batching in llama.cpp backend (#659)
Author: Meng Zhang (7bd99d14c0), 2023-10-28 23:37:05 -07:00

* refactor: switch back to llama batch interface
* feat: support cont batching
engine.cc   feat: support continuous batching in llama.cpp backend (#659)   2023-10-28 23:37:05 -07:00
lib.rs      feat: support continuous batching in llama.cpp backend (#659)   2023-10-28 23:37:05 -07:00