From 2133f8f1877c2aac169b0933a9e295ec81b51455 Mon Sep 17 00:00:00 2001
From: Meng Zhang
Date: Wed, 22 Nov 2023 09:32:45 +0800
Subject: [PATCH] docs: set the quick start example to be StarCoder-1B

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 4947965..0626091 100644
--- a/README.md
+++ b/README.md
@@ -55,7 +55,7 @@ The easiest way to start a Tabby server is by using the following Docker command
 docker run -it \
   --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
   tabbyml/tabby \
-  serve --model TabbyML/SantaCoder-1B --device cuda
+  serve --model TabbyML/StarCoder-1B --device cuda
 ```
 
 For additional options (e.g inference type, parallelism), please refer to the [documentation page](https://tabbyml.github.io/tabby).