Brian Cardarella posts on X most often about vml, javascript, chrome, and $200month. They currently have XXXXX followers and XXX posts still getting attention, totaling XXX engagements in the last XX hours.
Social category influence: agencies #185, finance XXX%
Social topic influence: vml #32, javascript 0.9%, chrome 0.9%, $200month 0.9%, 200k 0.9%, instead of 0.9%, is there 0.9%, dems 0.9%, built on 0.9%, dom XXX%
Top accounts mentioned or mentioned by: @elixirlang @leozh @bschultzer @theprimeagen @yordisprieto @zachsdaniel1 @dockyard @sergeymoiseev @bryanjbryce @thmsmlr @peterdedene @warreninthebuff @adolfont @miilad2025 @ziglang @sasajuric @doctorthe113 @lightninglu10 @kmdrfx @thdxr
Top posts by engagements in the last XX hours:
"Our new back-end agnostic headless browser attached to Chrome remote debugger executing JavaScript on the native UI DOM. BTW that's running an older version of LiveView Native along with both a LiveCompont in one TabView and a nested LiveView in another"
X Link @bcardarella 2025-10-14T20:40Z 5017 followers, 2447 engagements
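That post describes driving a live page through Chrome's remote debugging protocol. As a rough, non-authoritative sketch of the mechanism (not LiveView Native's actual integration), the following uses the chrome-remote-interface Node package and assumes Chrome was launched with --remote-debugging-port=9222:

```ts
// Minimal sketch: attach to Chrome's remote debugger and evaluate JavaScript
// against the live page. Assumes Chrome is running with
// --remote-debugging-port=9222 and that chrome-remote-interface is installed.
import CDP from 'chrome-remote-interface';

async function main(): Promise<void> {
  const client = await CDP({ port: 9222 }); // connect to the first debuggable target
  const { Runtime } = client;
  try {
    await Runtime.enable();
    // Execute arbitrary JavaScript in the page; here we just read the DOM title.
    const { result } = await Runtime.evaluate({
      expression: 'document.title',
      returnByValue: true,
    });
    console.log('Page title:', result.value);
  } finally {
    await client.close();
  }
}

main().catch(console.error);
```

A headless driver along these lines would layer its own DOM operations on top of protocol domains like Runtime and DOM.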
"The Claude pricing tiers are really confusing. I'm on Pro Max $200/month and the 200k limit is a major pain in the ass for what I'm using"
X Link @bcardarella 2025-10-12T18:23Z 5019 followers, XXX engagements
"@kmdrfx @thdxr I presume this was in reference to very very long chat histories in opencode. It's a common perf issue in nearly every UI framework. Virtualizing off screen content is usually the most common solution"
X Link @bcardarella 2025-10-13T21:02Z 5019 followers, XX engagements
"@thdxr If opentui doesn't/cannot virtualize what is off screen then the perf problems are going to persist. Even native UI frameworks like SwiftUI have to virtualize this stuff"
X Link @bcardarella 2025-10-13T18:40Z 5019 followers, XXX engagements
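The two replies above both come down to list virtualization: only mount the rows that intersect the viewport and skip the rest. A minimal, framework-agnostic sketch of that windowing math (the helper names are illustrative, not from opencode or opentui):

```ts
// Given a scroll position, compute which rows are actually visible and render
// only those, plus a small overscan buffer above and below.
interface VirtualWindow {
  start: number;   // index of the first rendered row
  end: number;     // index one past the last rendered row
  offsetY: number; // translate the rendered slice to its scrolled position
}

function computeWindow(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  overscan = 5,
): VirtualWindow {
  const first = Math.floor(scrollTop / rowHeight);
  const visibleCount = Math.ceil(viewportHeight / rowHeight);
  const start = Math.max(0, first - overscan);
  const end = Math.min(totalRows, first + visibleCount + overscan);
  return { start, end, offsetY: start * rowHeight };
}

// Example: 10,000 chat messages, 24px rows, 600px viewport, scrolled to 48,000px.
const win = computeWindow(48_000, 600, 24, 10_000);
console.log(win); // { start: 1995, end: 2030, offsetY: 47880 }
```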
"This is a serious question. Is there something I'm not obviously getting"
X Link @bcardarella 2025-10-16T16:46Z 5017 followers, 1267 engagements
"@ryanrwinchester Hardi's install guide says no more than 3/8" flashing visible. They did 1.77""
X Link @bcardarella 2025-10-13T10:53Z 5018 followers, XXX engagements
"After spending the last two weeks codegen'ing with @ziglang I have some advice to LLM doubters: Pick a language or stack you know nothing about and give it a shot. For me I spent way too long trying to get the LLM to write @elixirlang how I want to write it. And that's where the mismatch was. Do I know if the most idiomatic Zig is being written No. Do I care No"
X Link @bcardarella 2025-10-14T18:11Z 5017 followers, 2347 engagements
"Just vibe coded a VML parser in @ziglang"
X Link @bcardarella 2025-10-09T22:54Z 5017 followers, 1312 engagements
"Kind of looks like a scene from The Walking Dead"
X Link @bcardarella 2025-10-15T13:06Z 5017 followers, XXX engagements
"We now have an XML spec compliant library written in @ziglang built on top of dom HTML is next then VML #Ziglang"
X Link @bcardarella 2025-10-12T08:57Z 5017 followers, 1888 engagements
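The post above only names the layering (XML today, HTML and VML next, all built on a shared dom library); the Zig API itself isn't shown anywhere in these posts. A purely illustrative sketch of that shape, with hypothetical type names:

```ts
// Illustrative only: one shared DOM node representation, with separate
// XML / HTML / VML front ends that all produce the same tree.
interface DomNode {
  name: string;
  attributes: Record<string, string>;
  children: DomNode[];
  text?: string;
}

// Each markup dialect gets its own parser over the same DOM.
interface MarkupParser {
  parse(source: string): DomNode;
}

// e.g. const tree = xmlParser.parse('<a b="c"/>');
```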
"LLM Codegen against well written specs is kind of magical. However there are some optimizations I've had to do: X. fetch original spec document (usually in HTML) X. pandoc convert to markdown X. ask LLM to analyze and remove anything from the document that is unnecessary and optimize for tokenization Without losing spec fidelity: DOM: 760k tokens to 62k tokens HTML: 3.7M tokens to 750k"
X Link @bcardarella 2025-10-12T13:13Z 5018 followers, XXX engagements
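The three steps in that post sketch a spec-shrinking pipeline: fetch the HTML spec, convert it with pandoc, then have a model strip what's unnecessary. A rough sketch under those assumptions, with the spec already saved locally as spec.html and askLLM left as a hypothetical placeholder for whatever model client is in use:

```ts
// Rough sketch of the spec-shrinking pipeline. Assumes pandoc is on PATH and
// the spec has already been fetched to spec.html (step 1).
import { execFileSync } from 'node:child_process';
import { readFileSync, writeFileSync } from 'node:fs';

// Step 2: convert the HTML spec to Markdown with pandoc.
execFileSync('pandoc', ['spec.html', '-f', 'html', '-t', 'gfm', '-o', 'spec.md']);

// Step 3: ask a model to remove anything unnecessary while keeping spec fidelity.
// `askLLM` is a hypothetical placeholder, not a real client library.
async function askLLM(prompt: string): Promise<string> {
  throw new Error('replace with a call to your actual LLM client');
}

async function compressSpec(): Promise<void> {
  const markdown = readFileSync('spec.md', 'utf8');
  const trimmed = await askLLM(
    'Remove anything unnecessary from this spec and optimize it for tokenization without losing fidelity:\n\n' +
      markdown,
  );
  writeFileSync('spec.min.md', trimmed);
}

compressSpec().catch(console.error);
```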