
![ClementDelangue Avatar](https://lunarcrush.com/gi/w:24/cr:twitter::186420551.png) clem 🤗 [@ClementDelangue](/creator/twitter/ClementDelangue) on X · 149.1K followers
Created: 2025-07-25 17:00:16 UTC

All closed-source frontier labs use tons of open source all over the stack, starting from Python, @PyTorch, and @huggingface, all the way down to RoPE, GQA, flash-attention, or any tiny improvement released by open-source players.

The whole transformer architecture (the T in gpT) comes from an open research paper and an open-source model from 2017:

This is OK, but it would be much nicer if they acknowledged the contributions & contributed back more!
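
The components named above are not abstractions; they surface directly when running an open-weight model. Below is a minimal sketch (not from the post) of a typical Hugging Face + PyTorch setup, where RoPE and GQA appear as config fields and flash-attention is chosen as the attention backend. It assumes `torch`, `transformers`, and `flash-attn` are installed; the checkpoint name is only an illustrative example of a Llama-style model.

```python
# Minimal sketch: how the open-source pieces in the post fit together.
# Assumes torch, transformers, and flash-attn are installed and the
# (illustrative) checkpoint is accessible.
import torch
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "meta-llama/Llama-3.1-8B"  # hypothetical example checkpoint

# RoPE and GQA are plain fields in the open-source model config.
config = AutoConfig.from_pretrained(model_id)
print("RoPE base frequency:", config.rope_theta)
print("GQA key/value heads:", config.num_key_value_heads,
      "vs attention heads:", config.num_attention_heads)

# flash-attention is selected as the attention implementation at load
# time; PyTorch supplies the tensors and kernels underneath.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
)
```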


XXXXX engagements

![Engagements Line Chart](https://lunarcrush.com/gi/w:600/p:tweet::1948790414179230032/c:line.svg)

**Related Topics**
[gpt](/topic/gpt)
[stack](/topic/stack)

[Post Link](https://x.com/ClementDelangue/status/1948790414179230032)
