I have just added a new white paper onto the Objective Analysis website: Matching Flash to the Processor – Why Multithreading Needs Parallelized Flash.
This document examines the evolution of today’s CPUs: their clock frequencies have stopped increasing, so they now exploit parallelism to scale performance. Performance computing has likewise added multiple DRAM channels to bring parallelism to the memory interface.
Storage hasn’t kept pace with this move to parallelism, and that is limiting today’s systems.
New NAND flash DIMMs recently introduced by Diablo, SanDisk, and IBM provide a reasonable approach to adding parallel flash to a system on its fastest bus – the memory channel. This white paper shows that storage can be scaled to match the processor’s growing performance by adding flash DIMMs to each of the many DRAM buses in a performance server.
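A back-of-the-envelope calculation shows why the memory channel is such an attractive place for flash. The figures below are my own illustrative assumptions (a two-socket server with four DDR3-1600 channels per socket), not numbers from the white paper:

```python
# Illustrative comparison: aggregate bandwidth available to flash DIMMs
# spread across a server's DRAM channels vs. a single conventional
# storage link. All figures are assumptions, not white-paper data.

DDR3_1600_GBPS = 12.8   # peak per-channel DDR3-1600 bandwidth, GB/s
CHANNELS = 4            # memory channels per socket (assumed)
SOCKETS = 2             # two-socket server (assumed)
SATA3_GBPS = 0.6        # SATA 6 Gb/s link, roughly 0.6 GB/s usable

# Flash DIMMs on every DRAM channel can, in principle, share this much:
memory_channel_total = DDR3_1600_GBPS * CHANNELS * SOCKETS

print(f"Aggregate DRAM-channel bandwidth: {memory_channel_total:.1f} GB/s")
print(f"Single SATA 6 Gb/s link:          {SATA3_GBPS:.1f} GB/s")
print(f"Ratio: roughly {memory_channel_total / SATA3_GBPS:.0f}x")
```

The point is not that flash DIMMs reach DDR3 peak rates – they don’t – but that the memory channel’s aggregate headroom grows with every channel the processor adds, which is exactly the scaling storage has been missing.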
The white paper is downloadable for free from the Objective Analysis home page. Have a look.
I can already hear readers saying: “Wait! You can’t do that!” Well, you’re right, but the new module comes awfully close to that by putting the NAND behind an ASIC that interfaces between the DDR3 bus and the NAND.
Why do this? Quite simply because you can get more “Bang for the Buck” by adding NAND to the system once you’ve reached a certain DRAM size. The Diablo “Memory Channel Storage” (MCS) approach supports the addition of terabytes of NAND at the loss of …
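The “Bang for the Buck” point can be sketched with rough numbers. The prices below are hypothetical placeholders of my own, not figures from the post or the white paper; the qualitative gap between DRAM and NAND cost per gigabyte is what matters:

```python
# Hypothetical cost-per-gigabyte sketch: once the working set outgrows an
# affordable DRAM complement, the same budget buys far more NAND capacity.
# Both prices are assumptions for illustration only.

DRAM_PER_GB = 8.0    # assumed $/GB for server DRAM
NAND_PER_GB = 0.7    # assumed $/GB for NAND flash

budget = 4000.0      # dollars available beyond the baseline DRAM (assumed)
dram_gb = budget / DRAM_PER_GB
nand_gb = budget / NAND_PER_GB

print(f"${budget:.0f} buys {dram_gb:.0f} GB of DRAM "
      f"or {nand_gb:.0f} GB of NAND")
```

With these assumed prices the same spend buys over ten times the capacity in NAND, which is the economic case for reaching terabyte-class storage on the memory channel rather than piling on more DRAM.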