[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]

@New_Equinox (New_Equinox)

New_Equinox posts on Reddit most often about the topics meta and latent. They currently have an undefined follower count and XX posts still getting attention, totaling XXX engagements in the last XX hours.

Engagements: XXX

Mentions: X

Followers: undefined

CreatorRank: XXXXXXXXX

Social Influence

Social category influence: technology brands

Social topic influence: meta, latent

Top Social Posts


Top posts by engagements in the last XX hours

"Mta introduces Continuous Learning via Sparse Memory Finetuning: A new method that uses Sparse Attention to Finetune only knowledge specific Parameters pertaining to the input leading to much less memory loss than standard Finetuning with all it's knowledge storing capability"
Reddit Link @New_Equinox 2025-10-22T19:52Z X followers, 1770 engagements
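The post above describes the method in one sentence: keep the backbone frozen and update only the memory parameters that are relevant to the current input. Below is a minimal PyTorch sketch of that general pattern, not Meta's implementation; the class name, dimensions, top-k selection rule, and the MSE objective in the usage snippet are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMemoryLayer(nn.Module):
    """A large bank of memory slots; only the top-k slots that best match the
    current input take part in the forward pass and therefore in the update."""

    def __init__(self, d_model: int = 256, n_slots: int = 4096, top_k: int = 32):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)
        self.values = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)
        self.top_k = top_k

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, d_model) hidden states from a frozen backbone.
        scores = h @ self.keys.t()                     # (batch, n_slots)
        top_scores, top_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(top_scores, dim=-1)        # (batch, top_k)
        selected = self.values[top_idx]                # (batch, top_k, d_model)
        # Residual read from only the selected slots; unselected slots never
        # enter the computation and so receive no gradient.
        return h + torch.einsum("bk,bkd->bd", weights, selected)


# Usage sketch: update only the memory parameters on new data.
memory = SparseMemoryLayer()
optimizer = torch.optim.AdamW(memory.parameters(), lr=1e-3)
hidden = torch.randn(8, 256)        # stand-in for frozen-backbone activations
target = torch.randn(8, 256)        # stand-in supervision signal
loss = F.mse_loss(memory(hidden), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because only the gathered slots appear in the computation graph, gradients reach just those rows of the key and value banks, which is what makes the update sparse and limits interference with everything else the model has stored.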

"Robotic warfare is gonna be for the 21st century what Nuclear Bombs were for the 20th"
Reddit Link @New_Equinox 2025-10-18T17:10Z X followers, XXX engagements

"(Meta) The Free Transformer: An improvement to Transformers adding a Latent Random Variable to the decoder allowing the model to decide in a hidden state how it guides its output before it predicts the next token. +3% Compute overhead +30% GSM8K +35% MBPP and +40% HumanEval+ on a 1.5B Model"
Reddit Link @New_Equinox 2025-10-22T21:18Z X followers, 1332 engagements
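Taken at face value, this post describes a decoder that samples a latent variable partway through its stack and conditions the remaining layers on it. The sketch below is a hedged PyTorch reconstruction of that idea only; the Gaussian latent, its placement after the lower half of the layers, the layer sizes, and every name are assumptions, and the training machinery the paper presumably uses (an inference encoder and a KL-style regularizer on the latent) is not shown.

```python
import torch
import torch.nn as nn


class LatentFreeDecoder(nn.Module):
    """Decoder-only stack that samples a latent variable after the lower
    layers and conditions the upper layers on it (illustrative sketch)."""

    def __init__(self, d_model=256, n_heads=4, n_layers=4, d_latent=32, vocab=1000):
        super().__init__()
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.embed = nn.Embedding(vocab, d_model)
        self.lower = nn.ModuleList([make_layer() for _ in range(n_layers // 2)])
        self.upper = nn.ModuleList([make_layer() for _ in range(n_layers // 2)])
        self.to_latent = nn.Linear(d_model, 2 * d_latent)   # mean and log-variance
        self.from_latent = nn.Linear(d_latent, d_model)
        self.lm_head = nn.Linear(d_model, vocab)

    def forward(self, tokens: torch.Tensor):
        # Causal mask so each position attends only to earlier tokens.
        causal = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.embed(tokens)
        for layer in self.lower:
            h = layer(h, src_mask=causal)
        # Latent step: sample z from the current hidden state, then let the
        # rest of the stack shape the output conditioned on that choice.
        mu, logvar = self.to_latent(h).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        h = h + self.from_latent(z)
        for layer in self.upper:
            h = layer(h, src_mask=causal)
        # A real training setup would add a KL term on (mu, logvar).
        return self.lm_head(h), mu, logvar


# Usage sketch.
model = LatentFreeDecoder()
logits, mu, logvar = model(torch.randint(0, 1000, (2, 16)))
```

Sampling z before the upper layers is what lets the model commit to a choice in a hidden state before the next-token prediction is produced, at the cost of the small extra compute the post cites.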

"(Meta) "EncodeThinkDecode (ETD): Scaling reasoning through recursive latent thoughts." Improving the reasoning of base models by training them to iterate over a subset of reasoning-critical NN layers during mid-training. Modest improvements on Math Benchmarks (+36% on Math with OLMo X 1B)"
Reddit Link @New_Equinox 2025-10-22T21:37Z X followers, XXX engagements
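The ETD summary above amounts to looping a small, weight-shared block of "reasoning-critical" middle layers several times between an encode prefix and a decode suffix, so extra compute is spent refining the latent state. The PyTorch sketch below shows that recursive pattern under stated assumptions; the layer counts, sizes, number of think iterations, and all names are illustrative rather than details from the paper.

```python
import torch
import torch.nn as nn


class RecursiveThinker(nn.Module):
    """Encode, then repeatedly apply a shared 'think' block, then decode."""

    def __init__(self, d_model=256, n_heads=4, vocab=1000, n_think_iters=4):
        super().__init__()
        make_layer = lambda: nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.embed = nn.Embedding(vocab, d_model)
        self.encode = nn.ModuleList([make_layer() for _ in range(2)])
        self.think = nn.ModuleList([make_layer() for _ in range(2)])  # reused each iteration
        self.decode = nn.ModuleList([make_layer() for _ in range(2)])
        self.lm_head = nn.Linear(d_model, vocab)
        self.n_think_iters = n_think_iters

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        causal = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.embed(tokens)
        for layer in self.encode:
            h = layer(h, src_mask=causal)
        # Recursive latent "thinking": reuse the same middle layers several
        # times so extra compute refines the hidden state before decoding.
        for _ in range(self.n_think_iters):
            for layer in self.think:
                h = layer(h, src_mask=causal)
        for layer in self.decode:
            h = layer(h, src_mask=causal)
        return self.lm_head(h)


# Usage sketch: more think iterations means more latent reasoning compute
# without adding parameters, since the think block is weight-shared.
model = RecursiveThinker(n_think_iters=4)
logits = model(torch.randint(0, 1000, (2, 16)))
```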