[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
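For reference, an authenticated request with a personal API key might look like the Python sketch below. The base URL and endpoint path shown are assumptions for illustration only; verify both against the authentication documentation at https://lunarcrush.ai/auth before relying on them.

```python
import os
import requests

# Assumed base URL and path format -- hypothetical, not authoritative.
# Check https://lunarcrush.ai/auth for the actual endpoints and auth scheme.
API_BASE = "https://lunarcrush.com/api4/public"
CREATOR_PATH = "creator/twitter/support_huihui/v1"  # hypothetical path

def fetch_creator(api_key: str) -> dict:
    """Fetch creator data with a bearer-token header instead of guest access."""
    resp = requests.get(
        f"{API_BASE}/{CREATOR_PATH}",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Read the key from an environment variable rather than hard-coding it.
    print(fetch_creator(os.environ["LUNARCRUSH_API_KEY"]))
```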

# ![@support_huihui Avatar](https://lunarcrush.com/gi/w:26/cr:twitter::1885583326565851137.png) @support_huihui huihui.ai

huihui.ai posts on X most often about the topics ollama, just a, business, and if you. They currently have XXXXX followers and XXX posts still getting attention, totaling XXX engagements in the last XX hours.

### Engagements: XXX [#](/creator/twitter::1885583326565851137/interactions)
![Engagements Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::1885583326565851137/c:line/m:interactions.svg)

- X Week XXXXX +62%
- X Month XXXXX +42%
- X Months XXXXXX +908%

### Mentions: X [#](/creator/twitter::1885583326565851137/posts_active)
![Mentions Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::1885583326565851137/c:line/m:posts_active.svg)

- X Week XX no change
- X Month XX +37%
- X Months XXX +850%

### Followers: XXXXX [#](/creator/twitter::1885583326565851137/followers)
![Followers Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::1885583326565851137/c:line/m:followers.svg)

- X Week XXXXX +8.90%
- X Month XXXXX +15%
- X Months XXXXX +145%

### CreatorRank: XXXXXXXXX [#](/creator/twitter::1885583326565851137/influencer_rank)
![CreatorRank Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::1885583326565851137/c:line/m:influencer_rank.svg)

### Social Influence

**Social topic influence**
[ollama](/topic/ollama) #23, [just a](/topic/just-a), [business](/topic/business), [if you](/topic/if-you)

### Top Social Posts
Top posts by engagements in the last XX hours

"@H8KUcom We are preparing to upload the file ggml-model-Q4_K_M.gguf from Huihui-GLM-4.6-abliterated again which has a size of 210GB to see if it can be uploaded successfully"  
[X Link](https://x.com/support_huihui/status/1997267532919844941)  2025-12-06T11:31Z 1081 followers, XXX engagements


"We plan to upload huihui_ai/glm4.6-abliterated:357b-q4_K_M to Ollama first. Fortunately Ollama currently has no storage restrictions. We've been hesitating about whether to upload the q4_K_M version because Ollama's GLM-4.6 is now only available in the cloud version and we estimate that it's probably just a Q4-series quantization. We don't want to negatively impact Ollama's business"  
[X Link](https://x.com/support_huihui/status/1997272322823143896)  2025-12-06T11:50Z 1082 followers, XXX engagements


"Ollama: ollama run huihui_ai/glm4.6-abliterated ollama run huihui_ai/glm4.6-abliterated refers to ollama run huihui_ai/glm4.6-abliterated:357b-q4_K_M"  
[X Link](https://x.com/support_huihui/status/1997396418647191649)  2025-12-06T20:03Z 1082 followers, XXX engagements


"In order to save space on we will gradually remove the GGUF files from the earliest models. If you have a particular model that you would like to keep you can download it in advance; no further notice will be given"  
[X Link](https://x.com/support_huihui/status/1997264533401035191)  2025-12-06T11:19Z 1085 followers, XXX engagements


"New Model: huihui-ai/Huihui-GLM-4.6V-Flash-abliterated This is an uncensored version of zai-org/GLM-4.6V-Flash created with abliteration. It was only the text part that was processed not the image part"  
[X Link](https://x.com/support_huihui/status/1998452375003213913)  2025-12-09T17:59Z 1085 followers, XXX engagements


"New GGUF huihui-ai/Huihui-GLM-4.6-abliterated-GGUF/Q4_K_M-GGUF"  
[X Link](https://x.com/support_huihui/status/1997350898897699174)  2025-12-06T17:02Z 1085 followers, XXX engagements


"Ollama: ollama run huihui_ai/qwen3-next-abliterated:80b-a3b-instruct"  
[X Link](https://x.com/support_huihui/status/1998594827471262038)  2025-12-10T03:25Z 1086 followers, XXX engagements


"Ollama: ollama run huihui_ai/qwen3-next-abliterated or ollama run huihui_ai/qwen3-next-abliterated:80b-a3b-thinking"  
[X Link](https://x.com/support_huihui/status/1998862623128039926)  2025-12-10T21:09Z 1085 followers, XXX engagements


"Ollama: ollama run huihui_ai/kimi-k2-abliterated or ollama run huihui_ai/kimi-k2-abliterated:1026b-instruct-0905-Q2_K"  
[X Link](https://x.com/support_huihui/status/1999030205525172643)  2025-12-11T08:15Z 1086 followers, XXX engagements
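
The post about gradually removing older GGUF files advises downloading copies in advance. A minimal sketch of doing that programmatically, assuming the repositories named above (e.g. huihui-ai/Huihui-GLM-4.6-abliterated-GGUF) are hosted on Hugging Face and that the `huggingface_hub` package is installed; the filename argument is an illustrative assumption and should be checked against the files actually published in the repository:

```python
from huggingface_hub import hf_hub_download

# Assumes the repo lives on Hugging Face; the filename below is illustrative
# and must be checked against the files actually listed in the repository.
local_path = hf_hub_download(
    repo_id="huihui-ai/Huihui-GLM-4.6-abliterated-GGUF",
    filename="ggml-model-Q4_K_M.gguf",  # hypothetical example filename
)
print(f"Downloaded to {local_path}")
```

The `ollama run ...` commands quoted above start an interactive terminal session with the named model tag. The same locally pulled models can also be called programmatically through Ollama's HTTP API; this sketch assumes a default Ollama installation listening on localhost:11434 and that the tag has already been pulled (e.g. `ollama pull huihui_ai/qwen3-next-abliterated`):

```python
import requests

# Assumes the Ollama daemon is running on its default port and the model tag
# from the posts above has already been pulled locally.
MODEL = "huihui_ai/qwen3-next-abliterated"

def generate(prompt: str) -> str:
    """Send a single non-streaming generation request to the local Ollama API."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Summarize what GGUF quantization does in one sentence."))
```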
