Dark | Light
[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]
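
The notice above says full data requires an authenticated API request. As a minimal sketch, assuming a Bearer-token header and an endpoint path inferred from this page's links (both are assumptions, not confirmed against the API reference at https://lunarcrush.ai/auth), a request could look like this:

```python
# Hypothetical sketch of an authenticated LunarCrush request.
# The endpoint path below is an assumption; check https://lunarcrush.ai/auth
# for the actual API reference before relying on it.
import os
import requests

API_KEY = os.environ["LUNARCRUSH_API_KEY"]   # keep keys out of source code
BASE = "https://lunarcrush.com/api4/public"  # assumed public API base

resp = requests.get(
    f"{BASE}/creator/twitter/davtbutler/v1",         # assumed creator endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},   # assumed auth scheme
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # unscrambled follower/engagement metrics, if authorized
```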

# ![@davtbutler Avatar](https://lunarcrush.com/gi/w:26/cr:twitter::782529445781438464.png) @davtbutler Dave

Dave posts on X most often about inference, mainnet, docs, and Alibaba. They currently have XXX followers and XX posts still getting attention, totaling XX engagements in the last XX hours.

### Engagements: XX [#](/creator/twitter::782529445781438464/interactions)
![Engagements Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::782529445781438464/c:line/m:interactions.svg)

- X Week XXXXX -XX%
- X Month XXXXX +121%
- X Months XXXXXX -XX%

### Mentions: X [#](/creator/twitter::782529445781438464/posts_active)
![Mentions Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::782529445781438464/c:line/m:posts_active.svg)

- X Week X -XX%
- X Month XX +200%
- X Months XX +314%

### Followers: XXX [#](/creator/twitter::782529445781438464/followers)
![Followers Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::782529445781438464/c:line/m:followers.svg)

- X Week XXX +5%
- X Month XXX +15%
- X Months XXX +91%

### CreatorRank: n/a [#](/creator/twitter::782529445781438464/influencer_rank)
![CreatorRank Line Chart](https://lunarcrush.com/gi/w:600/cr:twitter::782529445781438464/c:line/m:influencer_rank.svg)

### Social Influence [#](/creator/twitter::782529445781438464/influence)
---

**Social category influence**
[technology brands](/list/technology-brands) 

**Social topic influence**
[inference](/topic/inference), [mainnet](/topic/mainnet), [docs](/topic/docs), [alibaba and](/topic/alibaba-and), [$9988hk](/topic/$9988hk), [alibaba](/topic/alibaba), [beta](/topic/beta), [open ai](/topic/open-ai), [model x](/topic/model-x), [virtual](/topic/virtual)
### Top Social Posts [#](/creator/twitter::782529445781438464/posts)
---
Top posts by engagements in the last XX hours

"Posting for XX days things you dont know about Nillion: Day 4: nilDB mainnet has X decentralised nodes across X locations on X continents. They are run by X independent entities including Deutsche Telekom Saudi Telecom Alibaba and Vodafone. You can see them in the docs under nilDB nodes (mainnet) here"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1946532973710344330) 2025-07-19 11:30:00 UTC XXX followers, XXX engagements


"Posting for XX days things you dont know about Nillion: Day 8: nilAI is so simple you can run private inference with a single API call - no setup required. Seriously. Try it yourself Even @juanaxyz00 can do it"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1947999135698710548) 2025-07-23 12:36:00 UTC XXX followers, 1624 engagements
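
The post above describes nilAI inference as a single API call. Purely as an illustrative sketch (the endpoint URL, model name, and auth scheme below are hypothetical, not taken from Nillion's docs), a single-call request of that shape might look like:

```python
# Illustrative only: one possible shape of a "single API call" for private
# inference. The URL, model id, and key variable are hypothetical placeholders.
import os
import requests

NILAI_URL = "https://nilai.example/v1/chat/completions"  # hypothetical endpoint
API_KEY = os.environ["NILAI_API_KEY"]                     # hypothetical key

resp = requests.post(
    NILAI_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "example-private-llm",  # hypothetical model id
        "messages": [{"role": "user", "content": "Hello from a private LLM"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```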


"Posting for XX days things you dont know about Nillion: Day 10: We did not want to use centralised AI meeting note takers @nillionnetwork so we built our own LouisAI. Personally I have recorded XX meetings this month and others internally use it on a daily basis. It is in closed beta right now let me know if you are keen for access. Always looking for feedback"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1948731460527292566) 2025-07-25 13:06:00 UTC XXX followers, 1086 engagements


"I think the main issue will always be user expectation. The clear example is AI (LLMs) - people are really only interested in having the latest state of the art model to use and privacy is a side thought. So if model X is the latest from openAI and is 10/10 if we tried something under MPC you may get 1/10 whereas with a TEE you may get 6/10. Now one day for model X you may be able to use MPC but by that point it will actually be model X*1000 that is state of the art.and people will want that. So imo for the forseeable future it is going to be hard to sell "MPC AI" as the functionality will be"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1949819792648548679) 2025-07-28 13:10:39 UTC XXX followers, XX engagements


"Yes would love to hear that feedback. For highly technical people for whom setting up and running local models (and capturing history etc) local may be the best option. But I really think for the average user they will struggle which is why I think web apps (or mobile) are virtual here. What models do you run locally at the moment"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1950210644478668883) 2025-07-29 15:03:45 UTC XXX followers, XX engagements


"Hey. Sure. To best support the most popular use cases and workflows we encounter Nillion's PETs mix currently prioritises leveraging MPC for storage and TEEs for general-purpose compute. One of the most compelling and popular PETs applications is AI inference. Even with a general-purpose MPC protocol with no other overheads LLM inference cost is multiplied by the number of nodes. TEEs are an overwhelmingly more competitive choice cost-wise in such cases. So right now we are focussed on enabling real world production use cases with MPC at the storage layer and not at the computation layer"  
![@davtbutler Avatar](https://lunarcrush.com/gi/w:16/cr:twitter::782529445781438464.png) [@davtbutler](/creator/x/davtbutler) on [X](/post/tweet/1949762247363797376) 2025-07-28 09:21:59 UTC XXX followers, XX engagements
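
The post above notes that even an overhead-free MPC protocol multiplies LLM inference cost by the number of nodes, while a TEE runs the model once at close to native cost. A toy calculation of that comparison, with all figures invented for illustration:

```python
# Toy cost comparison between MPC and TEE inference (all numbers assumed).
base_cost_per_1k_tokens = 0.002  # assumed single-machine inference cost, USD
mpc_nodes = 5                    # assumed number of MPC nodes
tee_overhead = 1.1               # assumed ~10% premium for enclave attestation

# Per the post: with no other protocol overheads, MPC cost scales with nodes.
mpc_cost = base_cost_per_1k_tokens * mpc_nodes
# A TEE executes the model once inside an enclave at near-native cost.
tee_cost = base_cost_per_1k_tokens * tee_overhead

print(f"MPC: ${mpc_cost:.4f} per 1k tokens")
print(f"TEE: ${tee_cost:.4f} per 1k tokens")
# Real MPC protocols add further per-node communication and computation
# overhead, so the gap in practice would be wider than this lower bound.
```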
