🐾 Tabby


Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.


🔥 What's New

  • 09/21/2023 We've hit 10K stars 🌟 on GitHub! 🚀🎉👏
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
Archived

👋 Getting Started

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation at https://tabbyml.github.io/tabby.
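Once the server is running, its OpenAPI interface can be called over plain HTTP. The sketch below shows how a client might build and send a completion request using only the Python standard library; the /v1/completions path and the payload field names are assumptions based on a typical Tabby deployment, so check the server's own OpenAPI spec for the exact schema.

```python
import json
import urllib.request

# Assumed server address, matching the docker command above.
TABBY_URL = "http://localhost:8080"

def build_completion_request(language: str, prefix: str, suffix: str = "") -> dict:
    """Build a code-completion payload.

    The field names here are assumptions; verify them against the
    server's published OpenAPI schema.
    """
    return {
        "language": language,
        "segments": {"prefix": prefix, "suffix": suffix},
    }

def request_completion(payload: dict) -> dict:
    """POST the payload to the (assumed) completions endpoint and decode the reply."""
    req = urllib.request.Request(
        f"{TABBY_URL}/v1/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_completion_request("python", "def fib(n):\n    ")
```

With a live server, `request_completion(payload)` would return the model's suggested completions; keeping payload construction separate from transport makes the request shape easy to unit-test without a running server.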

🤝 Contributing

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can run git submodule update --init --recursive to fetch all submodules.

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev

  3. Now, you can build Tabby by running the command cargo build.

Start Hacking!

... and don't forget to submit a Pull Request!

🌟 Star History

Star History Chart