🐾 Tabby


Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. Its key features include:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.
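
Because the server speaks plain HTTP, a completion request can be scripted in a few lines. The sketch below is a minimal illustration, not the authoritative client: the endpoint path (`/v1/completions`), default address (`localhost:8080`), and payload shape are assumptions drawn from Tabby's OpenAPI documentation, so check your server's Swagger UI for the exact schema.

```python
import json

# Assumed default address of a locally running `tabby serve` instance.
TABBY_URL = "http://localhost:8080"

def build_completion_request(language: str, prefix: str) -> dict:
    """Build the JSON body for a code-completion request.

    The field names here (`language`, `segments.prefix`) are assumptions
    based on Tabby's published OpenAPI interface.
    """
    return {"language": language, "segments": {"prefix": prefix}}

body = build_completion_request("python", "def fib(n):\n    ")
print(json.dumps(body))

# To actually send it (requires a running server):
#   curl -X POST http://localhost:8080/v1/completions \
#        -H 'Content-Type: application/json' \
#        -d "$(python this_script.py)"
```

Keeping the payload construction separate from the transport makes it easy to reuse the same request builder from an editor plugin or a test harness.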

Open in Playground

Demo

👀 What's New

👋 Get Started

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation at https://tabbyml.github.io/tabby.
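
For machines without an NVIDIA GPU, a CPU-only variant of the same command is one such option. This is a sketch under the assumption that `--device cpu` is accepted by this Tabby release (consult the documentation linked above for the supported device values); expect noticeably slower inference than on a GPU.

```shell
# CPU-only variant: drop --gpus and select the cpu device (assumed flag value).
docker run -it \
  -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cpu
```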

🌟 Star History

Star History Chart