[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]

# @ursisterbtw ursister

ursister posts on X most often about cli, ollama, and claude. They currently have XXXXX followers and XXX posts still getting attention, totaling XXX engagements in the last XX hours.

### Engagements: XXX [#](/creator/twitter::1418563263420411910/interactions)

- X Week: XXX (+1,333%)
- X Month: XXXXX (+2,713%)
- X Months: XXXXX (+71%)
- X Year: XXXXX (-XX%)

### Mentions: X [#](/creator/twitter::1418563263420411910/posts_active)

### Followers: XXXXX [#](/creator/twitter::1418563263420411910/followers)

- X Week: XXXXX (+0.13%)
- X Month: XXXXX (no change)
- X Months: XXXXX (-XXXX%)
- X Year: XXXXX (-XXXX%)

### CreatorRank: XXXXXXXXX [#](/creator/twitter::1418563263420411910/influencer_rank)

### Social Influence [#](/creator/twitter::1418563263420411910/influence)

**Social topic influence:** [cli](/topic/cli), [ollama](/topic/ollama), [claude](/topic/claude)

### Top Social Posts [#](/creator/twitter::1418563263420411910/posts)

Top posts by engagements in the last XX hours:

"@0xcheddie I havent used llms INSIDE claude code but Im fairly sure theres a way to do it with env vars - however I really like devstral in ollama or lm studio for local u can wire those into opencode cli if you want a terminal experience similar to cc using local models"

[@ursisterbtw](/creator/x/ursisterbtw) on [X](/post/tweet/1949423673787449761), 2025-07-27 10:56:37 UTC, 7474 followers, XX engagements
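The workflow the post hints at can be sketched in shell. This is a minimal, hypothetical setup assuming the target CLI reads standard OpenAI-style environment variables; the `ollama` commands are real, but the variable names and the model tag are assumptions, not confirmed opencode or Claude Code configuration.

```shell
# Pull the model and start the local Ollama server, which exposes an
# OpenAI-compatible API at http://localhost:11434/v1.
ollama pull devstral
ollama serve &

# Assumption: many terminal coding tools accept OpenAI-style env vars.
# These names are illustrative, not documented opencode settings.
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"   # Ollama ignores the key, but most clients require one
```

With the server running, any client pointed at that base URL would send chat completions to the local devstral model instead of a hosted API.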