[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
@danpiemont Dan Piemont posts on X most about ai, solar, and constellation. They currently have XXXXX followers and XX posts still getting attention, totaling XXX engagements in the last XX hours.
Social topic influence: ai, solar, constellation, at least, $10b, $125mm, inference, usable
Top posts by engagements in the last XX hours
"Long live the king of S-Band emitters"
X Link 2025-12-08T21:47Z 1609 followers, XXX engagements
"My view on datacenters in space. It absolutely works under one circumstance: you are already operating Starlink or a similar constellation. In that case all of the CAPEX (bus launch ground segment) is already bought and paid for by telecom users. Heat rejection should already be sized close to full solar output: say at least XX% of total power is usable for AI tasks. You can add AI compute to each satellite to perform AI tasks during periods of low utilization for each bus which could be 70%+. Your unit economics for adding AI then becomes basically just the cost of the chips plus some de"
X Link 2025-12-10T19:58Z 1630 followers, 54.3K engagements
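The post's core argument is that on an existing constellation the marginal cost of AI capacity is roughly just the chips, because power, thermal, launch, and ground segment are already paid for by telecom users. A minimal sketch of the idle-capacity math follows; all numeric inputs are illustrative assumptions (the post's own figures are scrambled in guest mode), not the author's numbers.

```python
# Sketch of the "idle capacity on an existing comms satellite" argument above.
# All inputs are illustrative assumptions, not the author's figures.

def idle_ai_energy_kwh_per_day(solar_kw: float,
                               ai_power_fraction: float,
                               idle_fraction: float) -> float:
    """Energy per day (kWh) one satellite could devote to AI workloads.

    solar_kw          -- total solar array output (assumed value)
    ai_power_fraction -- share of that power the existing heat-rejection
                         system can absorb when repurposed for AI (the post
                         argues this is high, since radiators are already
                         sized near full solar output)
    idle_fraction     -- share of the day the telecom payload is lightly
                         loaded (the post suggests 70%+)
    """
    return solar_kw * ai_power_fraction * idle_fraction * 24.0

# Example with assumed numbers: 10 kW array, 80% usable for AI, 70% idle time.
print(f"{idle_ai_energy_kwh_per_day(10.0, 0.8, 0.7):.0f} kWh/day per satellite")  # ~134
```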
"By the way the telecom biz is still the bigger one by far. Assuming Starlink is at about $10B in revenue right now with 8000 satellites that's $1.25MM per year per satellite. To estimate AI revenue per satellite assume they're all Starlink v2 with X kW solar. Assume a XX% load factor that's XXX kW usable power for AI enough to power XX 70W L4 GPUs for inference. XX% uptime gives 300k GPU-hours per year worth about $100k-$300k in annual AI revenue per satellite. A nice add-on biz and billions per year in revenue for a constellation of Starlink scale but not too attractive on a standalone"
X Link 2025-12-10T22:03Z 1609 followers, 3564 engagements
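The revenue comparison in the post above is straightforward arithmetic. A sketch under assumed inputs: only the $10B revenue, 8000 satellites, 70W L4 GPUs, and the ~$100k-$300k per-satellite conclusion are stated in the post; the solar size, load factor, uptime, and $/GPU-hour below are assumptions chosen merely to land in that stated range.

```python
# Back-of-envelope per-satellite economics from the post above. The telecom
# figures ($10B, 8000 satellites) and GPU power (70W L4) are from the post;
# the solar size, load factor, uptime, and $/GPU-hour are assumptions.

HOURS_PER_YEAR = 8760

# Telecom side (figures stated in the post)
telecom_revenue = 10e9
satellites = 8_000
telecom_per_sat = telecom_revenue / satellites          # $1.25M/yr per satellite

# AI add-on side (assumed inputs)
solar_kw = 10.0                     # assumption: per-satellite solar output
load_factor = 0.30                  # assumption: fraction of power free for AI
gpu_watts = 70.0                    # stated: L4-class inference GPU
uptime = 0.90                       # assumption
price_per_gpu_hour = (0.30, 0.90)   # assumption: $/GPU-hour for L4-class inference

usable_kw = solar_kw * load_factor
gpus = int(usable_kw * 1000 // gpu_watts)               # ~42 GPUs
gpu_hours = gpus * HOURS_PER_YEAR * uptime              # ~330k GPU-hours/yr
ai_revenue = tuple(gpu_hours * p for p in price_per_gpu_hour)

print(f"Telecom: ${telecom_per_sat:,.0f}/yr per satellite")
print(f"AI: {gpus} GPUs, {gpu_hours:,.0f} GPU-hours/yr, "
      f"~${ai_revenue[0]:,.0f}-${ai_revenue[1]:,.0f}/yr per satellite")
```

Even the top of that assumed AI range sits roughly an order of magnitude below the ~$1.25M/yr telecom figure, which matches the post's conclusion that the AI compute is an add-on business rather than a standalone one.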