docs: fix title for the decoding blog
parent e08a08b566
commit 6e59aa8d81
@@ -5,7 +5,7 @@ tags: [tech design]
 image: ./twitter-decoding.png
 ---

-# Decoding the Decoding in Tabby
+# Decode the Decoding in Tabby

 In the context of the Transformer model, which is widely used across LLMs, ***decoding*** refers to the process of generating an output sequence from an encoded input. Tabby recently [implemented ***incremental decoding***](https://github.com/TabbyML/tabby/pull/491) as part of its greedy search. This blog explains the thinking behind it 🛠️💡.
@@ -69,4 +69,4 @@ Incremental decoding: ......, 207, 211 -> "......[ hello]" ✅

 For interested folks, you can refer to Tabby's exact implementation in the `IncrementalDecoding` function in [`crates/tabby-inference/src/decoding.rs`](https://github.com/TabbyML/tabby/pull/491).

 Have you found our new decoding methods effective? Share your thoughts with us in our [Slack](https://join.slack.com/t/tabbyml/shared_invite/zt-22thejc0z-7ePKeWNCHPX31pEtnT4oYQ) channel 🌍😊!
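The context line in the second hunk ("......, 207, 211 -> "......[ hello]"") hints at the idea behind incremental decoding: rather than decoding each new token in isolation, the decoder re-decodes the accumulated token sequence and emits only the newly produced text, so tokens that don't form valid text on their own still come out right. The following is a minimal sketch of that idea, not Tabby's actual implementation; the toy vocabulary, the token ids 207/211, and the `IncrementalDecoder` type here are illustrative assumptions (real systems use a BPE tokenizer, and robust code must also handle multi-byte characters when slicing).

```rust
use std::collections::HashMap;

// Toy decode: maps each token id to a text fragment and concatenates.
// Illustrative only; a real tokenizer's fragments may not be valid text alone.
fn decode(vocab: &HashMap<u32, &str>, tokens: &[u32]) -> String {
    tokens.iter().map(|t| *vocab.get(t).unwrap_or(&"?")).collect()
}

/// Hypothetical incremental decoder: keeps the full token sequence and the
/// length of text already emitted, returning only the newly decoded suffix.
struct IncrementalDecoder {
    tokens: Vec<u32>,
    emitted_len: usize, // bytes of decoded text already emitted (ASCII here)
}

impl IncrementalDecoder {
    fn new() -> Self {
        Self { tokens: Vec::new(), emitted_len: 0 }
    }

    /// Append one token, re-decode the whole sequence, and return the delta.
    fn step(&mut self, vocab: &HashMap<u32, &str>, token: u32) -> String {
        self.tokens.push(token);
        let full = decode(vocab, &self.tokens);
        let new_text = full[self.emitted_len..].to_string();
        self.emitted_len = full.len();
        new_text
    }
}

fn main() {
    // Hypothetical ids: 207 and 211 together decode to " hello".
    let vocab: HashMap<u32, &str> = HashMap::from([(207, " hel"), (211, "lo")]);
    let mut dec = IncrementalDecoder::new();
    let a = dec.step(&vocab, 207);
    let b = dec.step(&vocab, 211);
    println!("{}{}", a, b); // prints " hello"
}
```

Re-decoding the whole sequence on every step keeps the emitted text consistent with what a one-shot decode of all tokens would produce, which is the property the blog's "✅" example is pointing at.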