tabby/crates/llama-cpp-bindings/include
Meng Zhang 7bd99d14c0
feat: support continuous batching in llama.cpp backend (#659)
* refactor: switch back to llama batch interface
* feat: support cont batching
2023-10-28 23:37:05 -07:00
engine.h