@ESYudkowsky
"Hey so I realize that macroeconomics is scary but this important note: - AI is not currently producing tons of real goods - Huge datacenter investments are functionally just throwing money around - So curbing AI wouldn't crash the economy IF the Fed then lowered rates"
X Link @ESYudkowsky 2025-09-25T02:48Z 210.9K followers, 214K engagements
"The goal of a CiteCheck benchmark should not be to check "Does the cited document say something related to the claim" but "Does the document state or very directly support the exact claim it's being cited about" Here are two failures from the last generation of LLMs that made an impression on me: Case 1: I was asking about pharmacokinetics of a medication I was considering asking for. The LLM delivered a sage report which included the claim that peak blood concentration happened X hours after oral consumption. Given that the drug had a relatively short half-life that didn't make sense to"
X Link @ESYudkowsky 2025-10-06T15:44Z 211K followers, 12.9K engagements
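The CiteCheck tweets in this dump describe a benchmark design criterion rather than any existing artifact. As a hedged illustration only, here is a minimal Python sketch of what an item schema and scorer built around that criterion could look like: a prediction of "directly_supported" on a document that is merely related counts as the characteristic failure, not a pass. All names, labels, and the example item are hypothetical and not taken from the source.

```python
# Hypothetical sketch of a CiteCheck-style benchmark item and scorer.
# The schema, label set, and function names are assumptions for illustration;
# no such benchmark is described as existing in the source posts.
from dataclasses import dataclass

# Design point from the post: a document that merely "says something related"
# must be scored as a failure unless it states or very directly supports
# the exact claim being cited.
LABELS = ("directly_supported", "related_but_unsupported", "not_supported")

@dataclass
class CiteCheckItem:
    claim: str            # the exact claim the citer attributes to the document
    cited_document: str   # the cited source text (or relevant excerpt)
    gold_label: str       # one of LABELS, assigned by human annotators

def score_predictions(items, predictions):
    """Return overall accuracy plus the rate of the specific failure mode:
    accepting a merely-related document as direct support for the claim."""
    correct = 0
    related_accepted_as_support = 0
    for item, pred in zip(items, predictions):
        if pred == item.gold_label:
            correct += 1
        if item.gold_label == "related_but_unsupported" and pred == "directly_supported":
            related_accepted_as_support += 1
    n = len(items)
    return {
        "accuracy": correct / n,
        "related_accepted_as_support_rate": related_accepted_as_support / n,
    }

if __name__ == "__main__":
    # Invented example loosely modeled on the pharmacokinetics anecdote above.
    items = [
        CiteCheckItem(
            claim="Peak blood concentration occurs X hours after oral dosing.",
            cited_document="The study reports pharmacokinetic parameters but gives no time-to-peak value.",
            gold_label="related_but_unsupported",
        ),
    ]
    print(score_predictions(items, ["directly_supported"]))
```

Tracking related_accepted_as_support_rate separately isolates exactly the failure mode the post calls out, rather than letting it wash out inside a single accuracy number.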
"There comes a day in every man's life when he asks an Artificial Intelligence how he can prove that he isn't the Antichrist"
X Link @ESYudkowsky 2025-10-08T22:44Z 210.9K followers, 80.4K engagements
"Fellow Chicagoan @Pontifex does your Church perchance have an office that can certify me as unlikely to be the Antichrist or not currently meeting those conditions"
X Link @ESYudkowsky 2025-10-08T22:44Z 210.9K followers, 39K engagements
""If Anyone Builds It Everyone Dies" is now out. Read it today if you want to see with fresh eyes what's truly there before others try to prime your brain to see something else instead"
X Link @ESYudkowsky 2025-09-16T20:32Z 210.9K followers, 304.6K engagements
"The thing about AI successionists is that they think they've had the incredible unshared insight that silicon minds could live their own cool lives and that humans aren't the best possible beings. They are utterly closed to hearing about how you could KNOW THAT and still disgree on the factual prediction that this happy outcome happens by EFFORTLESS DEFAULT when they cobble together a superintelligence. They are so impressed with themselves for having the insight that human life might not be 'best' that they are not willing to sit down and have the careful conversation about what exactly is"
X Link @ESYudkowsky 2025-10-05T05:54Z 210.9K followers, 79.6K engagements
"I've previously found LLMs to suck at "Track down cited pages/references and see if they support the citer's claim." LLMs hallucinate what the cited document says if the citer's claim sounds LLM-plausible. I wish a CiteCheck benchmark for this kind of task would get put together by someone so labs would try to get higher scores on it. Improving LLMs' abilities to check whether claims are supported by citations might help mitigate some of the damage that social media and slop are doing to our ability to live in a shared reality. Also then we get to run the resulting citation-checker over"
X Link @ESYudkowsky 2025-10-06T15:44Z 210.9K followers, 40.3K engagements
"Some people said this was their favorite / first actually-liked interview. By Chris Williamson of Modern Wisdom"
X Link @ESYudkowsky 2025-10-27T02:37Z 210.9K followers, 52.4K engagements
"Nate Soares and I are publishing a traditional book: If Anyone Builds It Everyone Dies: Why Superhuman AI Would Kill Us All. Coming in Sep 2025. You should probably read it Given that we'd like you to preorder it Nowish"
X Link @ESYudkowsky 2025-05-14T17:49Z 211K followers, 1.4M engagements
"The slur "doomer" was an incredible propaganda success for the AI death cult. Please do not help them kill your neighbors' children by repeating it"
X Link @ESYudkowsky 2025-10-31T17:01Z 210.9K followers, 102.2K engagements