[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]

# @dstrbtd_ai DSTRBTD

DSTRBTD posts on X most often about edges, miner, bittensor, and year of. They currently have XXX followers and X posts still getting attention, totaling XX engagements in the last XX hours.

### Engagements: XX [#](/creator/twitter::1940453713027416064/interactions)

- X Week: XX (-XX%)
- X Month: XXXXX (-XX%)

### Mentions: X [#](/creator/twitter::1940453713027416064/posts_active)

- X Week: X (-XX%)
- X Month: X (+20%)

### Followers: XXX [#](/creator/twitter::1940453713027416064/followers)

- X Week: XXX (+0.13%)
- X Month: XXX (+2%)

### CreatorRank: undefined [#](/creator/twitter::1940453713027416064/influencer_rank)

### Social Influence

**Social category influence:** [cryptocurrencies](/list/cryptocurrencies) XXXXX%

**Social topic influence:** [edges](/topic/edges) 33.33%, [miner](/topic/miner) 33.33%, [bittensor](/topic/bittensor) 33.33%, [year of](/topic/year-of) 33.33%, [decentralized](/topic/decentralized) 33.33%, [the first](/topic/the-first) XXXXX%

**Top accounts mentioned or mentioned by:** [@gonzalo_fuente1](/creator/undefined), [@primeintellect](/creator/undefined), [@obsessedfan5](/creator/undefined), [@gasdankca](/creator/undefined)

**Top assets mentioned:** [Bittensor (TAO)](/topic/bittensor)

### Top Social Posts

Top posts by engagements in the last XX hours:

"DSTRBTD has been registered as Bittensor's Subnet XX since Sept 2024. Our mission: Solving the Trust-less Incentivised Distributed Training Problem. Here's how we got started:"
[X Link](https://x.com/dstrbtd_ai/status/1942212323734651191) 2025-07-07T13:21Z, XXX followers, 4449 engagements

"DSTRBTD's Run X has now produced a 1.1B parameter model that edges ahead of comparable distributed training efforts on 2/3 LM evaluation benchmarks (HellaSwag, PIQA, ARC-E) for models in a similar size range."
[X Link](https://x.com/dstrbtd_ai/status/1960339335749792125) 2025-08-26T13:51Z, XXX followers, 23.4K engagements

"This week marks X year of DSTRBTD operating as Subnet XX on Bittensor, tackling the decentralized distributed training problem. In that time we've run the first Hivemind-based p2p training on Bittensor, upgraded our aggregation mechanism to OpenDiLoCo, and scaled to training a 1.1B parameter model with competitive lm-eval scores. As we enter year two we're carrying forward lessons from our first year on chain and focusing on: faster experimentation, stronger open-source contributions, growing our research team, and steadily scaling model size."
[X Link](https://x.com/dstrbtd_ai/status/1966516393093791746) 2025-09-12T14:56Z, XXX followers, 9342 engagements
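The Run X post above cites results on HellaSwag, PIQA, and ARC-E. For context, here is a minimal sketch of how a checkpoint in that size range could be scored on the same tasks with the open-source lm-evaluation-harness; the model id `dstrbtd/run-x-1.1b` is a hypothetical placeholder, not a published checkpoint name.

```python
# Minimal sketch: scoring a ~1.1B parameter checkpoint on the benchmarks
# mentioned in the Run X post (HellaSwag, PIQA, ARC-Easy) with the
# open-source lm-evaluation-harness (pip install lm-eval).
# NOTE: "dstrbtd/run-x-1.1b" is a hypothetical model id used for illustration.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",  # Hugging Face transformers backend
    model_args="pretrained=dstrbtd/run-x-1.1b,dtype=bfloat16",
    tasks=["hellaswag", "piqa", "arc_easy"],
    num_fewshot=0,
    batch_size=8,
)

# Each task reports accuracy (and normalized accuracy), which is what
# side-by-side comparisons against similarly sized models rely on.
for task, metrics in results["results"].items():
    print(task, metrics)
```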
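As the guest-access banner notes, the scrambled XXX values can be replaced with real figures by calling the API with a personal key (see https://lunarcrush.ai/auth). The sketch below shows the general shape of an authenticated request; the endpoint path is an assumption inferred from this page's `/creator/twitter::dstrbtd_ai` reference, so confirm the current route against the API documentation.

```python
# Minimal sketch: fetching unscrambled creator data with an API key.
# ASSUMPTION: the endpoint path below is inferred from the
# "/creator/twitter::dstrbtd_ai" reference on this page and may differ
# from the documented route; see https://lunarcrush.ai/auth for details.
import os
import requests

API_KEY = os.environ["LUNARCRUSH_API_KEY"]  # personal key from lunarcrush.ai/auth

resp = requests.get(
    "https://lunarcrush.com/api4/public/creator/twitter/dstrbtd_ai/v1",  # assumed route
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()

creator = resp.json().get("data", {})
# Field names are illustrative; inspect the JSON payload for the actual keys.
print(creator.get("followers"), creator.get("interactions_24h"))
```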