[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
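The banner above suggests authenticating requests with an API key to unlock full data. As a minimal sketch, the snippet below builds a token-authenticated request; the exact endpoint path and auth scheme are assumptions here (a bearer token is shown), so consult https://lunarcrush.ai/auth for the actual details before use.

```python
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder -- substitute your real key

def build_request(url: str, api_key: str) -> urllib.request.Request:
    """Attach a bearer token header, the scheme assumed by this sketch."""
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

# Hypothetical creator endpoint; the real path may differ.
req = build_request("https://lunarcrush.com/api4/...", API_KEY)
print(req.get_header("Authorization"))  # → Bearer YOUR_API_KEY
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would then return the unscrambled profile data for an authenticated account.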

# ![@jxmnop Avatar](https://lunarcrush.com/gi/w:26/cr:twitter::783098774130401280.png) @jxmnop jxmo

jxmo posts on X most often about agi, hardcore, the big, and gonna. They currently have XXXXXX followers and XXX posts still getting attention, totaling XXXXXXX engagements in the last XX hours.

### Engagements: XXXXXXX [#](/creator/twitter::783098774130401280/interactions)
![Engagements Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::783098774130401280/c:line/m:interactions.svg)

- X Week XXXXXXX -XX%
- X Month XXXXXXXXX -XX%
- X Months XXXXXXXXXX +32%
- X Year XXXXXXXXXX -XX%

### Mentions: XX [#](/creator/twitter::783098774130401280/posts_active)
![Mentions Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::783098774130401280/c:line/m:posts_active.svg)

- X Months XX -XX%
- X Year XXX +14%

### Followers: XXXXXX [#](/creator/twitter::783098774130401280/followers)
![Followers Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::783098774130401280/c:line/m:followers.svg)

- X Week XXXXXX +0.21%
- X Month XXXXXX +5.40%
- X Months XXXXXX +57%
- X Year XXXXXX +145%

### CreatorRank: XXXXXX [#](/creator/twitter::783098774130401280/influencer_rank)
![CreatorRank Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::783098774130401280/c:line/m:influencer_rank.svg)

### Social Influence [#](/creator/twitter::783098774130401280/influence)
---

**Social category influence**
[technology brands](/list/technology-brands) 

**Social topic influence**
[agi](/topic/agi) #1, [hardcore](/topic/hardcore) #23, [the big](/topic/the-big), [gonna](/topic/gonna), [meta](/topic/meta), [rl](/topic/rl), [xai](/topic/xai), [module](/topic/module), [llm](/topic/llm)
### Top Social Posts [#](/creator/twitter::783098774130401280/posts)
---
Top posts by engagements in the last XX hours

"ive heard something very interesting through the AI grapevine; apparently lurking deep in the shadows of the big tech companies there lies a group of smart and well-funded AI researchers building a Digital Nose these scientists so far have only succeeded in digitizing and categorizing a large number of smells using chemical sensors that detect in-air compounds however i can only imagine they must be working towards a longer term goal. to create A New Smell some wonderful indescribable smell never-before-smelled by human noses"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1923454693214388576) 2025-05-16 19:05:09 UTC 38.8K followers, 20.2K engagements


"s3 does vectors now vector databases are officially dead"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1945310870315393362) 2025-07-16 02:33:48 UTC 38.8K followers, 425.2K engagements


"excited to finally share on arxiv what we've known for a while now: All Embedding Models Learn The Same Thing embeddings from different models are SO similar that we can map between them based on structure alone. without *any* paired data feels like magic but it's real:🧵"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1925224612872233081) 2025-05-21 16:18:11 UTC 38.8K followers, 913.1K engagements


"new blog post "All AI Models Might Be The Same" in which i explain the Platonic Representation Hypothesis the idea behind universal semantics and we might use AI to understand whale speech and decrypt ancient texts"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1945905080781451396) 2025-07-17 17:54:59 UTC 38.8K followers, 92.2K engagements


"the human brain reserves XX% of its processing exclusively for vision. modern LLMs somehow evolved without this entirely"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1949945565627740456) 2025-07-28 21:30:25 UTC 38.8K followers, 25.2K engagements


"new paper from our work at Meta **GPT-style language models memorize XXX bits per param** we compute capacity by measuring total bits memorized using some theory from Shannon (1953) shockingly the memorization-datasize curves look like this: ___________ / / (🧵)"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1929903028372459909) 2025-06-03 14:08:32 UTC 38.8K followers, 411.7K engagements


"@_aidan_clark_ what's anti-MM but that's a good point i basically forgot all of deep learning"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1949893915634721129) 2025-07-28 18:05:11 UTC 38.8K followers, 3804 engagements


"very surprising that fifteen years of hardcore computer vision research contributed nothing toward AGI except better optimizers we still don't have models that get smarter when we give them eyes"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1949869844142473322) 2025-07-28 16:29:32 UTC 38.8K followers, 130.2K engagements


"so xAI just 10xd the amount of compute we use on RL and the models only got a tiny bit better are we just doing RL wrong or is pretraining just inherently much more useful"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1943484794781774280) 2025-07-11 01:37:37 UTC 38.8K followers, 250.3K engagements


"@vertinski maybe our crux here is that most of the 'AGI evals' are text-based. so my point is that adding image pretraining doesn't help on any of the (text-based) evals"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1949887123068506449) 2025-07-28 17:38:11 UTC 38.8K followers, XXX engagements


"again the AI labs are obsessed with building reasoning-native language models when they need to be building *memory-native* language models - this is possible (the techniques exist) - no one has done it yet (no popular LLM has a built in memory module) - door = wide open"  
![@jxmnop Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::783098774130401280.png) [@jxmnop](/creator/x/jxmnop) on [X](/post/tweet/1945857324285149256) 2025-07-17 14:45:13 UTC 38.8K followers, 64.4K engagements
