🐾 Tabby


Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.


🔥 What's New

  • 11/09/2023 v0.5.5 released! Featuring a UI redesign and performance improvements.
  • 10/24/2023 Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
  • 10/15/2023 RAG-based code completion is enabled by default in v0.3.0🎉! Check out the blog post explaining how Tabby utilizes repo-level context to get even smarter!
Archived
  • 10/04/2023 Check out the model directory for the latest models supported by Tabby.
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
  • 08/28/2023 Experimental support for the CodeLlama 7B.
  • 08/24/2023 Tabby is now on JetBrains Marketplace!

👋 Getting Started

You can find our documentation here.

Run Tabby in 1 Minute

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation page.
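Once the server is up, its OpenAPI interface can be exercised over plain HTTP. The sketch below builds a code-completion request against a local instance; the endpoint path and payload shape are assumptions based on Tabby's published API docs, so check the schema served by your server (e.g., its Swagger UI) if they differ.

```python
import json
from urllib.request import Request, urlopen  # urlopen is used only when a server is running

# Assumed endpoint of a locally running Tabby server; adjust host/port as needed.
TABBY_URL = "http://localhost:8080/v1/completions"

def build_completion_request(language: str, prefix: str) -> Request:
    """Build (but do not send) a code-completion HTTP request."""
    payload = json.dumps({
        "language": language,
        "segments": {"prefix": prefix},
    }).encode("utf-8")
    return Request(
        TABBY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("python", "def fib(n):\n    ")
print(req.full_url)
print(json.loads(req.data)["language"])
```

With a server running, `urlopen(req)` would return a JSON response containing completion choices; the request-building step is separated out here so it can be inspected without any server.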

🤝 Contributing

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can run git submodule update --recursive --init to fetch all submodules.

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev
  3. Now you can build Tabby by running cargo build.

Start Hacking!

... and don't forget to submit a Pull Request

🌍 Community

  • #️⃣ Slack - connect with the TabbyML community
  • 🎤 Twitter / X - engage with TabbyML for all things possible
  • 📚 LinkedIn - follow for the latest from the community
  • 💌 Newsletter - subscribe to unlock Tabby insights and secrets

🌟 Star History

Star History Chart