docs: add modelscope registry information to model directory
parent 3600d9e399
commit 0a949aa6b6
@@ -4,7 +4,7 @@ sidebar_position: 4
 # 🧑‍🔬 Models Directory
 
-## Completion models (For `--model`)
+## Completion models (`--model`)
 
 We recommend using
 
 * **small models (less than 400M)** for **CPU devices**.
 * For **1B to 7B models**, it's advisable to have at least **NVIDIA T4, 10 Series, or 20 Series GPUs**.
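The sizing guidance above maps directly onto `tabby serve` invocations; a sketch, assuming a locally installed `tabby` binary (the `--device` values shown are an assumption to verify against your installed version's `tabby serve --help`):

```bash
# CPU-only machine: prefer a model under ~400M parameters
tabby serve --model TabbyML/J-350M --device cpu

# Machine with an NVIDIA T4 / 10 Series / 20 Series GPU: 1B-7B models are viable
tabby serve --model TabbyML/StarCoder-1B --device cuda
```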
@@ -19,7 +19,7 @@ We recommend using
 | [TabbyML/StarCoder-1B](https://huggingface.co/TabbyML/StarCoder-1B) | [BigCode-OpenRAIL-M](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement) | ✅ | ✅ |
 | [TabbyML/J-350M](https://huggingface.co/TabbyML/J-350M) | [BSD-3](https://opensource.org/license/bsd-3-clause/) | ❌ | ❌ |
 
-## Chat models (For `--chat-model`)
+## Chat models (`--chat-model`)
 
 To ensure optimal response quality, and given that latency requirements are not stringent in this scenario, we recommend using a model with at least 3B parameters.
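Because chat tolerates higher latency, a small completion model can be paired with a larger (≥3B) chat model in a single server; a minimal sketch, again assuming a local `tabby` install and a CUDA-capable GPU (`--device cuda` is an assumption to check against your version):

```bash
# Pair a small completion model with a >=3B chat model on one GPU host
tabby serve --model TabbyML/StarCoder-1B --chat-model TabbyML/WizardCoder-3B --device cuda
```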
@@ -27,3 +27,11 @@ To ensure optimal response quality, and given that latency requirements are not
 | ------------------------------------------------------------------------- | :---------------------------------------------------------------------------------: |
 | [TabbyML/Mistral-7B](https://huggingface.co/TabbyML/Mistral-7B) | [Apache 2.0](https://opensource.org/licenses/Apache-2.0) |
 | [TabbyML/WizardCoder-3B](https://huggingface.co/TabbyML/WizardCoder-3B) | [OpenRAIL-M](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement) |
+
+## Alternative Registry
+
+By default, Tabby utilizes the [Hugging Face organization](https://huggingface.co/TabbyML) as its model registry. Mainland Chinese users have encountered challenges accessing Hugging Face for various reasons. The Tabby team has established a mirror at [ModelScope](https://www.modelscope.cn/organization/TabbyML), which can be used by setting the following environment variable:
+
+```bash
+TABBY_REGISTRY=modelscope tabby serve --model TabbyML/StarCoder-1B
+```