🐾 Tabby

latest release PRs Welcome Docker pulls

Slack Community Office Hours

Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.
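
As an illustration of the OpenAPI interface, a completion request might look like the sketch below. It assumes a Tabby server already running on localhost:8080 and uses the /v1/completions endpoint with a language plus prefix payload; check the API documentation for the authoritative request schema.

```shell
# Ask a locally running Tabby server to complete a Python snippet.
# Assumes the server from the Docker command below is up on port 8080.
curl -X POST http://localhost:8080/v1/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "language": "python",
        "segments": {
          "prefix": "def fib(n):\n    if n <= 1:\n        return n\n"
        }
      }'
```

The response is a JSON object containing the generated completion choices, which IDE plugins surface as inline suggestions.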

Open in Playground

Demo

🔥 What's New

  • 11/09/2023 v0.5.5 released, with a UI redesign and performance improvements!
  • 10/24/2023 Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
  • 10/15/2023 RAG-based code completion is enabled by default in v0.3.0🎉! Check out the blogpost explaining how Tabby utilizes repo-level context to get even smarter!
Archived
  • 10/04/2023 Check out the model directory for the latest models supported by Tabby.
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
  • 08/28/2023 Experimental support for CodeLlama 7B.
  • 08/24/2023 Tabby is now on JetBrains Marketplace!

👋 Getting Started

You can find our documentation here.

Run Tabby in 1 Minute

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation page.
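
For instance, on a machine without a GPU, the same image can run in CPU-only mode via the --device flag (this is a sketch of one such variant; expect noticeably slower inference on CPU):

```shell
# Same setup as above, but CPU-only: drop --gpus and select --device cpu.
docker run -it \
  -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cpu
```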

🤝 Contributing

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can run git submodule update --recursive --init to fetch all submodules.

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev
  3. Now, you can build Tabby by running the command cargo build.

Start Hacking!

... and don't forget to submit a Pull Request

🌍 Community

  • #️⃣ Slack - connect with the TabbyML community
  • 🎤 Twitter / X - engage with TabbyML for all things possible
  • 📚 LinkedIn - follow for the latest from the community
  • 💌 Newsletter - subscribe to unlock Tabby insights and secrets

🌟 Star History

Star History Chart