To date, computer memory chips have come in two kinds: the fast kind, which is great for doing computations faster than you can blink but loses its contents the moment power is cut, and the slow kind, which is great for storing information persistently but too slow for heavy computation. As a result, entire computer and information architectures have been built around caching – the art of keeping the data you'll need in the 'fast' memory, keeping the data you (hopefully) won't need in the 'slow' memory, and synchronizing the two every now and then.
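The caching idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class name, tier sizes, and plain-dict "slow" store are all invented for the example, not anything Intel ships): a small, fast tier sits in front of a larger, slow backing store, and least-recently-used items are spilled down when the fast tier fills up.

```python
from collections import OrderedDict

class TwoTierStore:
    """Illustrative sketch only: a tiny LRU cache ("fast" memory)
    in front of a slow backing store ("slow" memory)."""

    def __init__(self, capacity=2):
        self.capacity = capacity   # how many items fit in "fast" memory
        self.fast = OrderedDict()  # stands in for DRAM
        self.slow = {}             # stands in for disk/SSD

    def write(self, key, value):
        self.fast[key] = value
        self.fast.move_to_end(key)  # mark as most recently used
        self._evict()

    def read(self, key):
        if key in self.fast:            # cache hit: cheap
            self.fast.move_to_end(key)
            return self.fast[key]
        value = self.slow[key]          # cache miss: "expensive" fetch
        self.fast[key] = value
        self._evict()
        return value

    def _evict(self):
        # Synchronize the tiers: spill least-recently-used items downward
        while len(self.fast) > self.capacity:
            k, v = self.fast.popitem(last=False)
            self.slow[k] = v

store = TwoTierStore(capacity=2)
store.write("a", 1); store.write("b", 2); store.write("c", 3)
print("a" in store.fast)  # False: "a" was evicted to slow storage
print(store.read("a"))    # 1, fetched back from slow memory
```

The "caching penalty" the article refers to is exactly the miss path in `read`: on real hardware that fetch is orders of magnitude slower than a hit, which is why so much engineering effort goes into predicting what belongs in the fast tier.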
Although it still carries a steep price tag, Intel's latest memory chip has the potential to offer the best of both worlds: stable storage that can be read and updated rapidly. And once tomorrow's supercomputers no longer have to worry about caching penalties, we can expect better performance, faster analyses, and new big-data algorithms optimized for this new class of Optane memory chips.
Has Intel Invented a Universal Memory Tech?
The mysterious XPoint memory in Intel’s new Optane solid-state drive is a step toward universal memory