[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
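
For reference, a minimal Python sketch of an authenticated request to unlock the full data, assuming the `requests` library and an API key obtained via the auth page above; the exact endpoint path and response fields are assumptions for illustration, so check the LunarCrush API documentation for the real routes.

```python
# Minimal sketch of an authenticated LunarCrush request.
# Assumption: the creator endpoint path below is illustrative only; consult
# the official API docs (see https://lunarcrush.ai/auth) for the real routes.
import os

import requests

API_KEY = os.environ["LUNARCRUSH_API_KEY"]  # personal key from the auth page

# Hypothetical creator endpoint for @fchollet; the path may differ by API version.
url = "https://lunarcrush.com/api4/public/creator/twitter/fchollet/v1"
headers = {"Authorization": f"Bearer {API_KEY}"}

resp = requests.get(url, headers=headers, timeout=30)
resp.raise_for_status()

# With a valid key, the scrambled placeholders (XXXXXXX) come back as real values.
print(resp.json())
```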

@fchollet (François Chollet)

François Chollet posts on X most often about agi, science, and the future. They currently have XXXXXXX followers and 1,220 posts still getting attention, totaling XXXXXXX engagements in the last XX hours.

Engagements: XXXXXXX

Mentions: XX

Followers: XXXXXXX

CreatorRank: XXXXXXX

Social Influence

Social category influence: technology brands #7304, social networks, stocks

Social topic influence: to the, agi #128, science #3806, future, ai #2123, intro, should be, i dont care, threshold, flow

Top assets mentioned: Alphabet Inc Class A (GOOGL)

Top Social Posts

Top posts by engagements in the last XX hours

"The 3rd edition of my book Deep Learning with Python is being printed right now and will be in bookstores within X weeks. You can order it now from Amazon or from Manning. This time we're also releasing the whole thing as a XXX% free website. I don't care if it reduces book sales I think it's the best deep learning intro around and more people should be able to read it"
X Link 2025-09-18T14:01Z 591.1K followers, 687.4K engagements

"The bottleneck for deep skill isn't usually intelligence but boredom tolerance. Learning has an activation energy: below a certain skill threshold practice is tedious but above it it becomes a self-sustaining flow state. The entire battle is persisting until that transition"
X Link 2025-10-30T18:23Z 591.1K followers, 241.8K engagements

"The Keras community video meeting is happening today at 10am PT (in X hr XX min). Join to get updates on the development roadmap and ask questions to the Keras team. URL in next tweet"
X Link 2025-12-05T16:50Z 591.1K followers, 14.4K engagements

"Congrats to the ARC Prize 2025 winners The Grand Prize remains unclaimed but nevertheless 2025 saw remarkable progress on LLM-driven refinement loops both with "local" models and with commercial frontier models. We also saw the rise of zero-pretraining DL approaches like HRM and TRM. Lots of new learnings"
X Link 2025-12-05T18:32Z 591.1K followers, 73.8K engagements

"Satya invited Google to the party to "make it dance" But then it turned out Google was a level-99 kinesthetic savant that mastered breakdancing on the spot and stole the whole show"
X Link 2025-11-29T17:00Z 590.9K followers, 148.8K engagements

"To perfectly understand a phenomenon is to perfectly compress it to have a model of it that cannot be made any simpler. If a DL model requires millions parameters to model something that can be described by a differential equation of three terms it has not really understood it it has merely cached the data"
X Link 2025-12-03T14:55Z 591.1K followers, 123.1K engagements
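
As a concrete illustration of the compression point in the post above (the synthetic dataset and the exponential-decay rule below are assumptions chosen for illustration, not taken from the post), here is a sketch contrasting a one-parameter model with a model that merely caches every observation.

```python
# Illustrative sketch: "understanding as compression" on synthetic data.
# Assumption: the phenomenon is exponential decay, y' = -k * y, with k = 0.7.
import numpy as np

k_true = 0.7
t = np.linspace(0.0, 5.0, 200)
y = np.exp(-k_true * t)  # observations generated by the simple rule

# Compressed model: recover the single parameter k.
# Since log y = -k * t, k is the negated least-squares slope.
k_hat = -np.polyfit(t, np.log(y), 1)[0]
compressed_params = 1

# "Cached" model: a lookup table storing every observation verbatim.
lookup_table = {float(ti): float(yi) for ti, yi in zip(t, y)}
cached_params = len(lookup_table)

print(f"recovered k = {k_hat:.3f} (true value {k_true})")
print(f"parameters in the compressed model: {compressed_params}")
print(f"parameters in the memorizing model: {cached_params}")
```

Both models fit the observations, but only the one-parameter fit generalizes beyond the sampled time range, which is the sense in which it "understands" the phenomenon.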

"Unironically I think part of the reason why the ancient Greeks invented science and philosophy is because they spoke Greek. Syntax is a catalyst of semantics"
X Link 2025-12-08T01:42Z 591.1K followers, 216.5K engagements

"NVIDA chips are manufactured by TSMC a Taiwanese company. They're created using EUV lithography machines manufactured by ASML a Dutch company. These machines consist of XX% of German parts (by value) in particular ZEISS optics"
X Link 2025-08-25T20:38Z 591.1K followers, 2.2M engagements

"The essence of great software is the quality of its abstractions. Great abstractions are indefinitely composable and stackable. They become a dependable foundation for future work future thought. The best ones are so robust you never have to revisit them -- you can forget them and build forward without ever looking back. Achieving this is the greatest productivity hack in software engineering. Because the greatest productivity drain is backtracking"
X Link 2025-11-01T21:32Z 590.9K followers, 208.5K engagements

"All the great breakthroughs in science are at their core compression. They take a complex mess of observations and say "it's all just this simple rule". Symbolic compression specifically. Because the rule is always symbolic -- usually expressed as mathematical equations. If it isn't symbolic you haven't really explained the thing. You can observe it but you can't understand it"
X Link 2025-11-14T14:30Z 591.1K followers, 13.2M engagements

"Software engineering has been within X months of being dead continually since early 2023"
X Link 2025-11-25T15:30Z 591.1K followers, 382.3K engagements

"Black Friday deal for Deep Learning with Python (3rd edition): XX% off just today. Go buy it:"
X Link 2025-11-28T18:41Z 591.1K followers, 191.3K engagements

"My prediction of Waymo covering XX% of the US by eoy 2028 is looking good"
X Link 2025-12-03T19:31Z 590.9K followers, 88.4K engagements

"Either you crack general intelligence -- the ability to efficiently acquire arbitrary skills on your own -- or you don't have AGI. A big pile of task-specific skills memorized from handcrafted/generated environments isn't AGI not matter how big"
X Link 2025-12-04T19:04Z 591.1K followers, 114.6K engagements

"Back in 2019 ARC X had one goal: to focus the attention of AI researchers towards the biggest bottleneck on the way to generality the ability to adapt to novelty on the fly which was entirely missing from the legacy deep learning paradigm. Six years later the field has responded. With test-time adaptation we finally have reasoning models capable of genuine fluid intelligence. While ARC X is now saturating SotA models are not yet human-level on an efficiency basis. Meanwhile ARC X remains largely unsaturated showing these models are still operating far below the upper bound of human-level"
X Link 2025-12-11T18:24Z 591.1K followers, 111.8K engagements
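
To make the "test-time adaptation" idea mentioned above concrete (everything below, from the linear toy model to the demonstration-pair task, is an assumption for illustration and not the method used by any ARC Prize entrant), here is a sketch in which a copy of a model takes a few gradient steps on a task's demonstration pairs before answering that task's test input.

```python
# Toy sketch of test-time adaptation on a novel task.
# Assumptions: a linear model, squared-error loss, and randomly generated tasks.
import numpy as np

rng = np.random.default_rng(0)


def adapt_and_predict(weights, demo_x, demo_y, test_x, steps=1000, lr=0.1):
    """Copy the weights, fine-tune them on the demonstration pairs, then predict."""
    w = weights.copy()
    for _ in range(steps):
        residual = demo_x @ w - demo_y
        grad = demo_x.T @ residual / len(demo_x)  # gradient of mean squared error
        w -= lr * grad
    return test_x @ w


base_w = rng.normal(size=3)            # stands in for frozen pretrained weights

w_task = np.array([2.0, -1.0, 0.5])    # a novel rule the base model never saw
demo_x = rng.normal(size=(5, 3))       # the task's few demonstration inputs
demo_y = demo_x @ w_task               # the demonstrated outputs
test_x = rng.normal(size=3)            # the task's test input

frozen_pred = test_x @ base_w
adapted_pred = adapt_and_predict(base_w, demo_x, demo_y, test_x)

print(f"target:  {test_x @ w_task:+.3f}")
print(f"frozen:  {frozen_pred:+.3f}")   # no adaptation: usually far off
print(f"adapted: {adapted_pred:+.3f}")  # after adapting on the demos: close to target
```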