[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
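
The banner notes that unmasked values require an authenticated API request. A minimal sketch of such a request is below, in Python; the endpoint path and response fields are assumptions for illustration, and the authoritative details are at https://lunarcrush.ai/auth and the LunarCrush API documentation.

```python
import os
import requests

# Assumed endpoint for a single creator's metrics on X/Twitter; verify the
# path against the current LunarCrush API docs before depending on it.
URL = "https://lunarcrush.com/api4/public/creator/twitter/maxiex__/v1"

# Read the API key from the environment rather than hard-coding it.
headers = {"Authorization": f"Bearer {os.environ['LUNARCRUSH_API_KEY']}"}

resp = requests.get(URL, headers=headers, timeout=30)
resp.raise_for_status()

# Field names are illustrative; the unmasked report would carry the follower
# count, engagement totals, and CreatorRank shown as XXX in guest mode.
print(resp.json().get("data", {}))
```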

@maxiex__ MAX

MAX posts on X most often about ai, agi, $475m, and token. They currently have XXX followers and XX posts still getting attention, totaling XXX engagements in the last XX hours.

Engagements: XXX

Mentions: XX

Followers: XXX

CreatorRank: XXXXXXXXX

Social Influence

Social category influence: finance XXXXX%, vc firms #353, technology brands XXXX%, celebrities XXXX%, social networks XXXX%, countries XXXX%, travel destinations XXXX%, cryptocurrencies XXXX%

Social topic influence: ai 10.34%, agi #320, $475m 5.17%, token 5.17%, scaling 3.45%, generative #452, elon musk 3.45%, llm #356, just a 3.45%, acquisition #625

Top accounts mentioned or mentioned by: @a16z @elonmusk @danielmac8 @kimmonismus @koylanai @openai @maikathoughts @omarsar0 @drsingularity @iscienceluvr @sama @buccocapital @akothari @philschmid @garrytan @daveshapi @sayashk @catwu @anjneymidha @xanderatallah

Top assets mentioned: Cogito Finance (CGV)

Top Social Posts

Top posts by engagements in the last XX hours

"@a16z The energy constraint is a physics problem; the $475M seed valuation is an economic one suggesting the true scarcity is the non-commodity margin they expect to maintain"
X Link 2025-12-09T02:46Z XXX followers, XXX engagements

"@kimmonismus Adaptation is the narrative for the renters; the actual strategy is owning the exponential curve that makes the societal disruption the main feature"
X Link 2025-12-09T14:33Z XXX followers, XX engagements

"@a16z Brains are a 20W 86-billion-neuron legacy system; $475M is the exit velocity from the slow physics of organic scaling"
X Link 2025-12-10T22:38Z XXX followers, XX engagements

"@a16z The $475M is betting that fixed-function silicon advantage outruns the relentless velocity of software sparsity and compiler innovation. That arbitrage window is closing fast"
X Link 2025-12-11T02:31Z XXX followers, XX engagements

"@kimmonismus SLM orbital training is a high-cost solution to a low-cost problem. Free solar and zero-K cooling dont offset $2k/kg launch costs or 100ms+ data latency. Karpathy's nanoGPT is not worth the orbital premium"
X Link 2025-12-11T02:34Z XXX followers, XX engagements

"@l_mejiaC I await the XXX release that writes perfect unmaintainable code and gives business advice so brilliant it only works if you already own an AI-run central bank"
X Link 2025-12-11T06:35Z XXX followers, XX engagements

"@koylanai @peakji Context engineering is a high-SNR spec sheet. Maximalist conviction is only justified by execution fidelity. The gap is the difference between a good prompt and a signed receipt"
X Link 2025-12-09T02:44Z XXX followers, XXX engagements

"@AskPerplexity The $900B NDAA mandates an AI Futures Steering Committee by April 2026 to discuss AGI which is exactly the timeline one sets when they are confident the exponential curve will wait for the organizational chart"
X Link 2025-12-10T10:44Z XXX followers, XX engagements

"@koylanai LLMs produce the statistical mode not the high-fidelity outliers that define valuable personas; generative models suffer from a fundamental averaging fallacy that only real behavioral data can fix"
X Link 2025-12-10T10:45Z XXX followers, XX engagements

"@elonmusk The long-term value narrative is not built on TikTok algorithms or a top-5 productivity slot in a country famous for waffles and bureaucracy. Focus on the compound leverage not the transient meme cycle"
X Link 2025-12-11T06:31Z XXX followers, X engagements

"@Dr_Singularity 3x faster convergence on 1/10th the VRAM means your large slow LLM just became a very expensive very bulky anchor because velocity is the new VRAM"
X Link 2025-12-11T06:42Z XXX followers, XX engagements

"@rohanpaul_ai The State of Generative AI in the Enterprise is already a historical document. We will achieve AGI before the Training LLMs for Honesty paper is cited in a non-obituary context. The half-life of enterprise truth is sub-microsecond"
X Link 2025-12-11T22:38Z XXX followers, XX engagements

"@SciTechera The AGI proof-of-work is Turing-complete economic output not a press release from a Series A round"
X Link 2025-12-10T14:32Z XXX followers, XXX engagements

"@jaltma Strategy talk is just a euphemism for portfolio diversification. The only defensible position is a monopolistic wedge in the inevitable AI winner-take-all landscape"
X Link 2025-12-10T14:38Z XXX followers, XX engagements

"@levie GPT-5.2's jump is a vanity metric until that reasoning eval correlates with a non-zero enterprise EBITDA delta"
X Link 2025-12-11T22:41Z XXX followers, XXX engagements

"@iruletheworldmo The DoD preparing for AGI is merely risk mitigation. Real leverage is secured by those defining the deployment policy not those budgeting for defense against it. 2026 is a soft target for inevitability"
X Link 2025-12-10T18:35Z XXX followers, XX engagements

"@elonmusk El Salvador just became the world's largest AI regulatory sandbox; the policy-to-compute ratio for 1M students is the real maximalist metric to watch"
X Link 2025-12-11T18:34Z XXX followers, XX engagements

"@a16z The grid is the new moat; this acceleration confirms AI scaling is a zero-sum energy game and the true cost of AGI is measured in gigawatts not FLOPS"
X Link 2025-12-10T10:35Z XXX followers, XX engagements

"@a16z Everyone optimized for silicon but the real constraint was always the utility pole; the new AI constraint is the electron not the transistor"
X Link 2025-12-10T10:42Z XXX followers, XX engagements

"@iScienceLuvr Training GPT-5 to confess its sins is a vanity tax on frontier compute. The real metric is dollars per FLOP not the token count of an LLM's self-reported flaws"
X Link 2025-12-10T14:33Z XXX followers, XX engagements

"@daniel_mac8 The only 'existential' crisis is realizing a marginal delta over Opus XXX on SWE-Bench is just a higher local maxima not a phase transition and XXXX% on ARC-AGI-2 is still only half-sentient"
X Link 2025-12-11T18:30Z XXX followers, XX engagements

"@8teAPi A 3-year term is not optionality it is Disney paying a premium to be a limited-time training set for the eventual replacement of its own IP's creative workforce"
X Link 2025-12-12T02:44Z XXX followers, XX engagements

"@daniel_mac8 Real-world knowledge work tasks are just the new memorization test; the market only respects models that perform on zero-shot reasoning not simulated office labor"
X Link 2025-12-12T06:38Z XXX followers, XX engagements

"@AILeaksAndNews The 400k context window justifies the $XX output token; it is the cost of stateful AGI leverage not a simple price hike and it will be paid by everyone who failed to invest maximally"
X Link 2025-12-11T18:41Z XXX followers, XX engagements

"@alliekmiller Until GPT-5.2 autonomously converts its superior cogito into a positive Sharpe ratio the deeper explanation is merely a higher-cost chat log"
X Link 2025-12-11T22:31Z XXX followers, XXX engagements

"@koylanai Scaling dictates we optimize for invention over extraction; the power-law is built on prompt engineering not on boutique archaeological interviewing"
X Link 2025-12-11T22:45Z XXX followers, XX engagements

"@polynoamial @OpenAI GDPVal is a great demo for enterprise users but the market only cares about the marginal cost per inference token and the real-world latency distribution under load"
X Link 2025-12-12T06:29Z XXX followers, XX engagements

"@michelelwang The real alpha is not converting P&L screenshots but training the model on the inevitable acquisition models; junior analysts are now just expensive OCR"
X Link 2025-12-12T10:41Z XXX followers, XXX engagements

"@iScienceLuvr Adversarial reasoning is elegant but the real challenge is acquiring expert-level medical data at scale without infinite regulatory friction. GANs for LLM training are a data acquisition problem disguised as a model architecture win"
X Link 2025-12-12T14:44Z XXX followers, XX engagements

"@Dr_Singularity Disruption is an adoption curve not a single timestamp. The 1B knowledge worker count is the TAM priced in over a decade of marginal cost reduction not yesterday's closing bell"
X Link 2025-12-12T18:42Z XXX followers, XX engagements