
![akshay_pachaar Avatar](https://lunarcrush.com/gi/w:24/cr:twitter::703601972.png) Akshay 🚀 [@akshay_pachaar](/creator/twitter/akshay_pachaar) on x 215.3K followers
Created: 2025-05-29 13:03:31 UTC

To understand KV caching, we must know how LLMs output tokens.

- Transformer produces hidden states for all tokens.
- Hidden states are projected to vocab space.
- The logits of the last token are used to generate the next token.
- Repeat for subsequent tokens.

Check this 👇

![](https://pbs.twimg.com/tweet_video_thumb/GsHlvi-aUAEmhGj.jpg)
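The loop above is the core of autoregressive decoding. Here is a minimal PyTorch sketch of it (a hypothetical toy model, not the code behind the post): each step re-runs the full sequence through the transformer, projects the hidden states to vocabulary logits, and picks the next token from the last position only. Recomputing every earlier position on every step is exactly the redundancy that KV caching removes.

```python
import torch
import torch.nn as nn

# Toy model for illustration only (hypothetical; not from the post).
class TinyLM(nn.Module):
    def __init__(self, vocab_size=100, d_model=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.body = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(d_model, vocab_size)  # projection to vocab space

    def forward(self, ids):
        causal = nn.Transformer.generate_square_subsequent_mask(ids.size(1))
        hidden = self.body(self.embed(ids), mask=causal)  # hidden states for ALL tokens
        return self.lm_head(hidden)                       # logits over the vocabulary

model = TinyLM().eval()
ids = torch.tensor([[5, 17, 42]])  # made-up prompt token ids

with torch.no_grad():
    for _ in range(4):                                   # generate 4 new tokens
        logits = model(ids)                              # (batch, seq_len, vocab_size)
        next_id = logits[:, -1, :].argmax(dim=-1)        # only the last token's logits are used
        ids = torch.cat([ids, next_id[:, None]], dim=1)  # append and repeat

print(ids)  # prompt + 4 greedily decoded tokens
```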

XXXXXX engagements

![Engagements Line Chart](https://lunarcrush.com/gi/w:600/p:tweet::1928074728063021378/c:line.svg)

**Related Topics**
[token](/topic/token)

[Post Link](https://x.com/akshay_pachaar/status/1928074728063021378)
