tabby/crates

Latest commit: feat: add rocm support (#913)
Author: Meng Zhang (9c905e4849)
* Added build configurations for Intel and AMD hardware

* Improved rocm build

* Added options for OneAPI and ROCm

* Build llama using icx

* [autofix.ci] apply automated fixes

* Fixed rocm image

* Build ROCm

* Tried to adjust compile flags for SYCL

* Removed references to oneAPI

* Provide info about the used device for ROCm

* Added ROCm documentation

* Addressed review comments

* Refactored to expose generic accelerator information

* Pull request cleanup

* cleanup

* Delete .github/workflows/docker-cuda.yml

* Delete .github/workflows/docker-rocm.yml

* Delete crates/tabby-common/src/api/accelerator.rs

* update

* cleanup

---------

Co-authored-by: Cromefire_ <cromefire+git@pm.me>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2023-11-29 03:27:03 +00:00
Directory            Last commit                                                          Last updated
http-api-bindings    Release 0.7.0-dev                                                    2023-11-27 14:58:58 +08:00
juniper-axum         refactor: use tarpc for easier worker <-> hub communication (#781)   2023-11-14 12:48:20 -08:00
llama-cpp-bindings   feat: add rocm support (#913)                                        2023-11-29 03:27:03 +00:00
tabby                feat: add rocm support (#913)                                        2023-11-29 03:27:03 +00:00
tabby-common         Release 0.7.0-dev                                                    2023-11-27 14:58:58 +08:00
tabby-download       Release 0.7.0-dev                                                    2023-11-27 14:58:58 +08:00
tabby-inference      refactor: handle max output length in StopCondition (#910)           2023-11-28 16:57:16 +08:00
tabby-scheduler      Release 0.7.0-dev                                                    2023-11-27 14:58:58 +08:00