tabby/crates
Meng Zhang 296342efd8
refactor: use llama.cpp tokenizer (#683)
* refactor: switch to llama.cpp tokenizer to simplify implementation

* refactor: remove tokenizer dependency from tabby

* refactor: renaming decoding to stop condition

* refactor: remove tokenizer dependency

* refactor: remove submodule

* chore: update formatting

* move tokenization to c++
2023-10-31 22:16:09 +00:00
http-api-bindings      fix: align with fastchat mainstream (#670)   2023-10-29 19:31:46 -07:00
llama-cpp-bindings     refactor: use llama.cpp tokenizer (#683)     2023-10-31 22:16:09 +00:00
rust-cxx-cmake-bridge  Release 0.5.0-dev                            2023-10-24 13:05:33 -07:00
tabby                  refactor: use llama.cpp tokenizer (#683)     2023-10-31 22:16:09 +00:00
tabby-common           refactor: use llama.cpp tokenizer (#683)     2023-10-31 22:16:09 +00:00
tabby-download         refactor: use llama.cpp tokenizer (#683)     2023-10-31 22:16:09 +00:00
tabby-inference        refactor: use llama.cpp tokenizer (#683)     2023-10-31 22:16:09 +00:00
tabby-scheduler        Revert "feat: supports PHP (#634)"           2023-10-28 23:02:10 -07:00