
![rohanpaul_ai Avatar](https://lunarcrush.com/gi/w:24/cr:twitter::2588345408.png) Rohan Paul [@rohanpaul_ai](/creator/twitter/rohanpaul_ai) on x 74.2K followers
Created: 2025-07-14 03:41:06 UTC

This “stunning” proof by an MIT computer scientist is the first progress in XX years on one of the most famous questions in computer science.

Space complexity vs Time complexity.

The new idea proves that any algorithm that runs in T steps can be re-engineered to use about √T memory cells, establishing that memory (RAM) is a much stronger resource than earlier theory allowed.
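In the standard notation of complexity theory (my gloss; the post itself stays informal), the result is usually stated as a simulation bound, with a log factor hiding inside the "about √T":

```latex
% Hedged statement of the simulation bound as commonly reported:
% any multitape Turing machine running in time t(n) can be
% simulated by one using O(sqrt(t(n) * log t(n))) space.
\[
  \mathsf{TIME}[t(n)] \;\subseteq\; \mathsf{SPACE}\!\left[\sqrt{t(n)\,\log t(n)}\,\right]
\]
```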

A computer spends time (i.e. time complexity) running steps and spends memory (i.e. space complexity) holding data. 

Memory is the list of numbered slots inside RAM where a program keeps facts it will soon need again. Space complexity counts the largest number of slots in use at one moment. Time complexity counts the total steps the processor performs before the answer appears.
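A hand-sized sketch (mine, not from the post) of how the two budgets are counted, using a duplicate-check task; the `steps` counter and slot counts are illustrative bookkeeping, not a formal cost model:

```python
def has_duplicate_low_space(items):
    """O(n^2) steps, O(1) extra slots: compare every pair directly."""
    steps = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            steps += 1
            if items[i] == items[j]:
                return True, steps, 1   # constant extra memory
    return False, steps, 1

def has_duplicate_low_time(items):
    """O(n) steps, O(n) extra slots: remember everything seen so far."""
    seen = set()                        # the table that buys the speed
    steps = 0
    for x in items:
        steps += 1
        if x in seen:
            return True, steps, len(seen)
        seen.add(x)
    return False, steps, len(seen)

print(has_duplicate_low_space([3, 1, 4, 1]))   # (True, 5, 1)
print(has_duplicate_low_time([3, 1, 4, 1]))    # (True, 4, 3)
```

Both answer the same question; one pays in steps, the other in slots.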

Think about sorting X M email addresses. A quicksort touches each address many times but only needs a handful of extra slots, so its time cost is high and its space cost is low. A counting sort can finish in fewer steps but must open a huge table in memory, so its space cost is high and its time cost is lower. Designers pick the mix that fits their hardware limits.
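A minimal sketch of that trade-off, assuming small integer keys stand in for email addresses (counting sort needs integer keys, so this is an illustrative simplification, not the post's exact scenario):

```python
def quicksort_in_place(a, lo=0, hi=None):
    """Many passes over the data, but only O(log n) extra stack frames."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:                      # Hoare-style partition, in place
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quicksort_in_place(a, lo, j)
    quicksort_in_place(a, i, hi)

def counting_sort(a, max_key):
    """One pass to tally, one to rebuild, but a table of max_key+1 slots."""
    counts = [0] * (max_key + 1)       # the big memory bill, paid up front
    for x in a:
        counts[x] += 1
    out = []
    for key, c in enumerate(counts):
        out.extend([key] * c)
    return out

keys = [3, 1, 4, 1, 5, 9, 2, 6]        # stand-ins for email addresses
a = keys[:]
quicksort_in_place(a)
assert a == counting_sort(keys, max(keys))
```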

Ryan Williams of MIT showed that a program needing T steps can be rebuilt to need only about √T cells of memory.

So clever reuse of a tiny memory region (i.e. space in RAM) can replace the need to run for a very long time. In raw computational power, space is the heavier lever.

Following Williams's new idea, the program is rewritten so it grabs a much smaller block of RAM, then keeps reusing that block, wiping and refilling it over and over.

Each wipe-and-refill adds extra steps, so the total run time balloons. In plain terms, he trades speed for memory.
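A toy analogue of that wipe-and-refill trade (an illustration of the flavor, not Williams's actual construction): reverse a read-only stream either by holding all T items at once, or by re-scanning the stream once per output item, squaring the step count while shrinking working memory to a few counters.

```python
def reverse_fast(make_stream):
    """T steps, T memory slots: hold the whole stream, emit it backwards."""
    buf = list(make_stream())          # the big block of RAM
    return buf[::-1]

def reverse_small(make_stream, length):
    """~T^2 steps, O(1) working slots: re-scan the stream once per item.

    The output is write-only and, by convention, not counted as working
    space; the only reused state is a pair of loop counters.
    """
    out = []
    for want in range(length - 1, -1, -1):
        for i, x in enumerate(make_stream()):   # wipe and refill: full re-scan
            if i == want:
                out.append(x)
                break
    return out

data = [3, 1, 4, 1, 5]
assert reverse_fast(lambda: iter(data)) == reverse_small(lambda: iter(data), len(data))
```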

---
wired.com/story/for-algorithms-a-little-memory-outweighs-a-lot-of-time/

![](https://pbs.twimg.com/media/GvyYVuRbIAAdNQU.png)

XXXXXX engagements

![Engagements Line Chart](https://lunarcrush.com/gi/w:600/p:tweet::1944603034165838143/c:line.svg)

**Related Topics**
[mit](/topic/mit)

[Post Link](https://x.com/rohanpaul_ai/status/1944603034165838143)
