🐾 Tabby


Self-hosted AI coding assistant. An opensource / on-prem alternative to GitHub Copilot.

Warning: Tabby is still in the alpha phase.

Features

  • Self-contained, with no need for a DBMS or cloud service
  • Web UI for visualizing and configuring models and MLOps.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Consumer-level GPU support (FP-16 weight loading with various optimizations).

Demo

Open in Spaces

Demo

Getting Started: Server

Docker

We recommend adding the following aliases to your .bashrc or .zshrc file:

# Save aliases to bashrc / zshrc
alias tabby="docker run -u $(id -u) -p 8080:8080 -v $HOME/.tabby:/data tabbyml/tabby"

# Alias for GPU (requires NVIDIA Container Toolkit)
alias tabby-gpu="docker run --gpus all -u $(id -u) -p 8080:8080 -v $HOME/.tabby:/data tabbyml/tabby"

After adding these aliases, you can invoke Tabby with the tabby command. Here are some usage examples:

# Usage
tabby --help

# Serve the model
tabby serve --model TabbyML/J-350M

Getting Started: Client

We offer multiple ways to connect to the Tabby server, including the OpenAPI interface and editor extensions.

API

Tabby serves a FastAPI server at localhost:8080, which includes OpenAPI documentation of its HTTP API. The same API documentation is also hosted at https://tabbyml.github.io/tabby
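As a rough sketch of what calling the HTTP API might look like, the snippet below sends a completion request with Python's standard library. The endpoint path (/v1/completions) and the payload fields (language, segments.prefix) are assumptions for illustration; consult the OpenAPI documentation served at localhost:8080 for the authoritative schema.

```python
import json
import urllib.request

# Hypothetical completion request body; verify the exact field names
# against the server's OpenAPI docs before relying on this shape.
payload = {
    "language": "python",
    "segments": {"prefix": "def fib(n):\n    "},
}

req = urllib.request.Request(
    "http://localhost:8080/v1/completions",  # assumed endpoint path
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        # The response is expected to contain completion choices.
        print(json.load(resp))
except OSError as exc:
    # No server running locally; the request shape above is the point.
    print(f"request not sent: {exc}")
```

If the server is running (e.g., via the tabby alias above), the same request can be issued with any HTTP client, or tried interactively from the OpenAPI page at localhost:8080.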

Editor Extensions