| Path | Last commit | Date |
| --- | --- | --- |
| include | feat: switch cpu backend to llama.cpp (#638) | 2023-10-25 15:40:11 -07:00 |
| llama.cpp@5cc49e631f | feat: upgrade llama.cpp (#645) | 2023-10-27 12:18:46 -07:00 |
| src | feat: upgrade llama.cpp (#645) | 2023-10-27 12:18:46 -07:00 |
| Cargo.toml | feat: switch cuda backend to llama.cpp (#656) | 2023-10-27 13:41:22 -07:00 |
| build.rs | fix: docker build for llama cuda backend | 2023-10-27 16:36:54 -07:00 |