# Getting Started
Tabby is an open-source, self-hosted AI coding assistant. With Tabby, every team can set up its own LLM-powered code completion server with ease.
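As a sketch of what self-hosting looks like, the Tabby server can be launched with Docker. The image tag, model name, and flags below follow the upstream quickstart but may differ across versions, so treat this as an illustrative example and check the installation docs for your release:

```shell
# Illustrative: run the Tabby server on a CUDA-capable machine.
# The completion API is then served at http://localhost:8080.
docker run -it --gpus all \
  -p 8080:8080 \
  -v $HOME/.tabby:/data \
  tabbyml/tabby serve \
  --model StarCoder-1B \
  --device cuda
```

On a CPU-only machine, dropping `--gpus all` and `--device cuda` should work, though completion latency will be higher.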
[Slack](https://join.slack.com/t/tabbycommunity/shared_invite/zt-1xeiddizp-bciR2RtFTaJ37RBxr8VxpA) | [LinkedIn](https://www.linkedin.com/company/tabbyml/) | [GitHub](https://github.com/TabbyML/tabby)
## Principles
* **Open**: Tabby is free, open-source, and compatible with major Coding LLMs (CodeLlama, StarCoder, CodeGen). You can use and combine your preferred models without writing any integration code yourself.
* **End-to-End**: While most coding tools treat code completion as a thin wrapper atop Coding LLMs, in real-world scenarios optimizations in the IDE extension can be just as crucial as the capabilities of the Coding LLM itself.
Tabby optimizes the entire serving stack:
+ IDE extensions: Tabby achieves accurate streaming and cancellation with an adaptive caching strategy to ensure rapid completion (in less than a second).
+ Model serving: Tabby parses relevant code into Tree-sitter tags to build effective prompts.
* **User and Developer Experience**: The key to sustainable open-source solutions is making it easy for everyone to contribute.
+ AI experts should feel comfortable understanding and improving the suggestion quality.
+ EngOps teams should find it easy to set up and feel in control of their data.
+ Developers should have an "aha" moment during coding time.
Tabby optimizes the experience for these core users to enhance your team's productivity.
## Community
⭐ Star the Tabby [GitHub repo](https://github.com/TabbyML/tabby) to stay updated on new releases and tutorials.
Join the Tabby community on [Slack](https://join.slack.com/t/tabbycommunity/shared_invite/zt-1xeiddizp-bciR2RtFTaJ37RBxr8VxpA) to get direct support from the community.
## Roadmap
We continuously update [our roadmap](./roadmap) and love to discuss it with our community. Feel free to participate.