[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]

# ![@danielhanchen Avatar](https://lunarcrush.com/gi/w:26/cr:reddit::t2_5wukhd4.png) @danielhanchen danielhanchen

danielhanchen posts on Reddit most often about vram and k2. They currently have undefined followers and XXX posts still receiving attention, totaling XXX engagements in the last XX hours.

### Engagements: XXX [#](/creator/reddit::t2_5wukhd4/interactions)
![Engagements Line Chart](https://lunarcrush.com/gi/w:600/cr:reddit::t2_5wukhd4/c:line/m:interactions.svg)

- X Week XXXXX +229%
- X Month XXXXXX +115%
- X Months XXXXXX -XX%
- X Year XXXXXX +139%

### Mentions: X [#](/creator/reddit::t2_5wukhd4/posts_active)
![Mentions Line Chart](https://lunarcrush.com/gi/w:600/cr:reddit::t2_5wukhd4/c:line/m:posts_active.svg)

- X Months XX -XX%
- X Year XX +85%

### Followers: undefined [#](/creator/reddit::t2_5wukhd4/followers)
![Followers Line Chart](https://lunarcrush.com/gi/w:600/cr:reddit::t2_5wukhd4/c:line/m:followers.svg)

- X Months XXXXXX +19%
- X Year XXXXXX +139%

### CreatorRank: XXXXXXXXX [#](/creator/reddit::t2_5wukhd4/influencer_rank)
![CreatorRank Line Chart](https://lunarcrush.com/gi/w:600/cr:reddit::t2_5wukhd4/c:line/m:influencer_rank.svg)

### Social Influence

**Social topic influence**
[vram](/topic/vram) #53, [k2](/topic/k2)
### Top Social Posts
Top posts by engagement in the last XX hours

"P GRPO fits in 8GB VRAM - DeepSeek R1's Zero's recipe"  
[Reddit Link](https://redd.it/1ik3nkr)  2025-02-07T19:40Z X followers, 2043 engagements


"Gemma 3n Fine-tuning now in Unsloth - 1.5x faster with XX% less VRAM + Fixes"  
[Reddit Link](https://redd.it/1lp5nhy)  2025-07-01T17:02Z X followers, 1788 engagements


"Train your own Reasoning model - XX% less VRAM - GRPO now in Unsloth (7GB VRAM min.)"  
[Reddit Link](https://redd.it/1ijab77)  2025-02-06T19:07Z X followers, 7790 engagements


"Unsloth October Release"  
[Reddit Link](https://redd.it/1ohqthr)  2025-10-27T21:17Z X followers, XXX engagements


"You can now train LLMs 3x faster with XX% less memory (3.9GB VRAM)"  
[Reddit Link](https://redd.it/1pj51tu)  2025-12-10T15:17Z X followers, 5210 engagements


"You can now run DeepSeek-R1-0528 on your local device (20GB RAM min.)"  
[Reddit Link](https://redd.it/1kz6qku)  2025-05-30T15:13Z X followers, 2778 engagements


"Kimi K2 Thinking 1-bit Unsloth Dynamic GGUFs"  
[Reddit Link](https://redd.it/1ortopy)  2025-11-08T16:30Z X followers, 3684 engagements


"You can now do 500K context length fine-tuning - 6.4x longer"  
[Reddit Link](https://redd.it/1pbh87f)  2025-12-01T16:29Z X followers, 2049 engagements
