Rohan Paul (@rohanpaul_ai) on X, 74K followers
Created: 2025-07-13 17:28:43 UTC
More research showing how combining LLMs with price time-series data can improve trading strategies 👇
LLMoE adaptive routing for trading strategies
The LLM-Based Routing in Mixture-of-Experts (LLMoE) framework replaces a conventional softmax router with a language model that chooses between an "optimistic" and a "pessimistic" sub-expert after reading both the price time series and headline text. On MSFT data from 2006–2016 the approach lifts total return to XXXXX% versus XXXXX% for a classic MoE and raises the Sharpe ratio accordingly, while remaining interpretable through the router's text rationale.
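The routing idea described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the router here is a toy heuristic standing in for an actual LLM call, and all names (`route_with_llm`, `optimistic_expert`, etc.) are hypothetical.

```python
# Sketch of LLM-based routing in a mixture-of-experts (LLMoE-style).
# A stub replaces the real language-model call so the example is self-contained.

def route_with_llm(headline: str, returns: list[float]) -> str:
    """Stand-in for the LLM router: reads headline text and recent returns,
    then emits 'optimistic' or 'pessimistic'. A real implementation would
    prompt an LLM with both modalities and also return a text rationale."""
    momentum = sum(returns)
    bullish = any(w in headline.lower() for w in ("beats", "record", "surge"))
    return "optimistic" if (momentum > 0 or bullish) else "pessimistic"

def optimistic_expert(returns: list[float]) -> int:
    # Bull-regime expert: stays long (1) unless momentum collapses.
    return 1 if sum(returns) > -0.05 else 0

def pessimistic_expert(returns: list[float]) -> int:
    # Bear-regime expert: only goes long on strong positive momentum.
    return 1 if sum(returns) > 0.05 else 0

EXPERTS = {"optimistic": optimistic_expert, "pessimistic": pessimistic_expert}

def decide(headline: str, returns: list[float]) -> int:
    """Route each sample to one expert; return a long (1) / flat (0) signal."""
    regime = route_with_llm(headline, returns)
    return EXPERTS[regime](returns)
```

Under this sketch, a bullish headline with positive recent returns routes to the optimistic expert and yields a long signal, while a negative headline with falling prices routes to the pessimistic expert and stays flat.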
XXXXXX engagements
Related Topics: msft
Post Link: https://x.com/rohanpaul_ai/status/1944448919347642817