[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. See https://lunarcrush.ai/auth for authentication information.]

# @diptalksdeep (deep.in)

@diptalksdeep (deep.in) posts on X most often about open ai, ollama, and backend. They currently have X followers and X posts still getting attention, totaling X engagements in the last XX hours.

### Engagements: X [#](/creator/twitter::1923318680307499008/interactions)

- X Month XX: +117%

### Mentions: X [#](/creator/twitter::1923318680307499008/posts_active)

- X Month X: +150%

### Followers: X [#](/creator/twitter::1923318680307499008/followers)

- X Month X: +50%

### CreatorRank: undefined [#](/creator/twitter::1923318680307499008/influencer_rank)

### Social Influence [#](/creator/twitter::1923318680307499008/influence)

---

**Social category influence:** [technology brands](/list/technology-brands) XX%

**Social topic influence:** [open ai](/topic/open-ai) 25%, [ollama](/topic/ollama) 25%, [backend](/topic/backend) XX%

### Top Social Posts [#](/creator/twitter::1923318680307499008/posts)

---

Top posts by engagements in the last XX hours:

"LLMs are overkill for most projects. Built a local SLM API server using FastAPI + Ollama runs fully offline streams like OpenAI costs X. Feels like a real backend not a demo. Open-sourcing soon"

[X Link](https://x.com/diptalksdeep/status/1981307862971478253) [@diptalksdeep](/creator/x/diptalksdeep) 2025-10-23T10:33Z, X followers, XX engagements
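The post describes the setup only at a high level and the project is not yet open-sourced, so the following is a minimal sketch of what such a server could look like: a FastAPI route in front of a locally running Ollama instance that re-emits Ollama's token stream in an OpenAI-style server-sent-events shape. The route path, model name, request schema, and default Ollama port are assumptions for illustration, not details taken from the post.

```python
# Minimal sketch (assumptions noted above): FastAPI proxying a local Ollama
# instance and streaming tokens back in an OpenAI-style SSE format.
import json

import httpx
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint
MODEL = "llama3.2:1b"                            # hypothetical small model

app = FastAPI()


class ChatRequest(BaseModel):
    messages: list[dict]  # [{"role": "user", "content": "..."}], OpenAI-style


@app.post("/v1/chat/completions")
async def chat(req: ChatRequest):
    async def stream():
        # Ollama returns newline-delimited JSON chunks when stream=True
        async with httpx.AsyncClient(timeout=None) as client:
            async with client.stream(
                "POST",
                OLLAMA_URL,
                json={"model": MODEL, "messages": req.messages, "stream": True},
            ) as resp:
                async for line in resp.aiter_lines():
                    if not line:
                        continue
                    chunk = json.loads(line)
                    token = chunk.get("message", {}).get("content", "")
                    # Re-wrap each token as an OpenAI-style SSE delta
                    yield "data: " + json.dumps(
                        {"choices": [{"delta": {"content": token}}]}
                    ) + "\n\n"
                yield "data: [DONE]\n\n"

    return StreamingResponse(stream(), media_type="text/event-stream")
```

Assuming Ollama is running locally with the chosen model pulled (e.g. `ollama pull llama3.2:1b`), this can be started with `uvicorn server:app` and consumed by any client that understands OpenAI-style streaming responses; nothing leaves the machine, which matches the "fully offline" claim in the post.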