root / tabby (mirror of https://gitee.com/zhang_1334717033/tabby.git)
tabby / crates / llama-cpp-bindings (history at commit 39962c79ca)
Latest commit: ebbe6e5af8 by Maciej — "fix: helpful message when llama.cpp submodule is not present (#719) (#775)", 2023-11-13 07:51:46 +00:00
include                  feat: add --parallelism to control throughput and VRAM usage (#727)                  2023-11-08 18:31:22 +00:00
llama.cpp @ efbd009d2f   feat: sync llama.cpp to latest                                                       2023-11-08 16:06:09 -08:00
src                      feat: add LLAMA_CPP_N_THREADS env (#742)                                             2023-11-09 19:54:23 +00:00
Cargo.toml               fix: exit the process when an error occurs in the background inference loop (#713)   2023-11-06 20:41:49 +00:00
build.rs                 fix: helpful message when llama.cpp submodule is not present (#719) (#775)           2023-11-13 07:51:46 +00:00
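The build.rs fix above concerns building with a missing llama.cpp submodule. As a rough illustration only (the function name and the marker file are assumptions, not tabby's actual code), a build script can verify the submodule directory before compiling and fail with an actionable message:

```rust
use std::path::Path;

// Hypothetical sketch of a submodule-presence check a build.rs might run
// before invoking the native build. The marker file choice is an assumption:
// llama.cpp's source tree has a CMakeLists.txt at its root, so its absence
// suggests the submodule was never initialized.
fn check_submodule(dir: &Path) -> Result<(), String> {
    if dir.join("CMakeLists.txt").exists() {
        Ok(())
    } else {
        Err(format!(
            "llama.cpp submodule not found at {}; \
             run `git submodule update --init --recursive` and rebuild",
            dir.display()
        ))
    }
}
```

In a real build.rs, the `Err` message would typically be surfaced via `panic!`, which Cargo prints to the user as the build failure reason.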
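The src entry references a LLAMA_CPP_N_THREADS environment variable (#742). A minimal sketch of how such an override is commonly parsed, assuming a fallback default of 4 (the real default is not shown in this listing, and `n_threads` is a hypothetical name):

```rust
use std::env;

// Hypothetical sketch: read LLAMA_CPP_N_THREADS if set and numeric,
// otherwise fall back to an assumed default of 4 threads.
fn n_threads() -> usize {
    env::var("LLAMA_CPP_N_THREADS")
        .ok()
        .and_then(|v| v.parse().ok())
        .unwrap_or(4)
}
```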