New Proof Dramatically Compresses Space Needed for Computation
Briefly

Recent findings from MIT's Ryan Williams, presented at the ACM Symposium on Theory of Computing, indicate that tasks previously thought to require significant memory can be executed with far less. For decades, the conventional wisdom held that a computational task taking t steps correspondingly required about t bits of memory. Williams' proof shows instead that any such task can be simulated using memory roughly proportional to the square root of t: a computation of, say, 100 steps could in principle get by with around 10 bits. This drastically shifts our understanding of the trade-off between time and space, with significant implications for computational complexity and the design of algorithms.
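In complexity-theoretic notation, the result is roughly the following containment (a sketch of the commonly cited form of the bound, assuming the multitape Turing machine model; exact constants and model details are in Williams' paper):

```latex
% Any problem decidable in time t(n) can also be decided
% using only on the order of sqrt(t(n) log t(n)) bits of memory.
\mathsf{TIME}[t(n)] \subseteq \mathsf{SPACE}\!\left[O\!\left(\sqrt{t(n)\,\log t(n)}\right)\right]
```

Ignoring the logarithmic factor, t = 100 steps corresponds to about sqrt(100) = 10 bits, which is the figure quoted above.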
“This result shows the prior intuition is completely false,” Williams says. “I thought there must be something wrong [with the proof] because this is extremely unexpected.”
"The breakthrough relies on a reduction, a means of transforming one problem into another that may seem unrelated but is mathematically equivalent."
Read at www.scientificamerican.com