Tag: Storage Class Memory

Does Persistent Memory Improve Performance? Ask Oracle!

At last month’s SNIA Persistent Memory Summit, Oracle presenter Jia Shi, Sr. Director of Exadata Development, shared some statistics on the Exadata system’s history over the past ten years. Shi highlighted the fact that the system’s I/O performance has grown from 0.05 million IOPS ten years ago to 16 million IOPS today, a 320X improvement! Shi said that Continue reading “Does Persistent Memory Improve Performance? Ask Oracle!”
SNIA Webcast: Emerging Memories
On Tuesday, January 14, Tom Coughlin and I were featured in a BrightTalk webinar hosted by the Storage Networking Industry Association (SNIA). A recording of this webinar has been posted so that you can view it at your convenience.
This webinar looks at emerging memories and where they now stand, giving a Continue reading “SNIA Webcast: Emerging Memories”
Where does NVRAM Fit?
There’s been a lot of interest in NVRAM recently. The technology has been lurking in the background for decades but has suddenly become very popular.
What is NVRAM? Quite simply, it’s DRAM or SRAM backed by a flash memory, a small controller, and a battery or super-capacitor. During normal operation the DRAM or SRAM is used in the system the same way any DRAM or SRAM would be used. When power is interrupted, the controller moves all of the data from the DRAM or SRAM to the flash using the backup power from the battery or super-capacitor. When power is restored, the controller moves the contents of the flash back into the SRAM or DRAM, and the processor can resume operation where it left off.
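To make that sequence concrete, here is a minimal Python sketch of the save/restore flow just described. The class and method names are hypothetical illustrations only, not an actual NVDIMM controller interface, and the sizes and power model are simplified assumptions.

class NVRAMModule:
    """Toy model of an NVRAM module: volatile DRAM/SRAM plus a flash backup."""

    def __init__(self, size_bytes):
        self.dram = bytearray(size_bytes)   # working memory the host reads and writes
        self.flash = bytes(size_bytes)      # non-volatile backup copy
        self.backup_power_ok = True         # battery or super-capacitor is charged

    def on_power_loss(self):
        # Controller uses backup power to copy the volatile contents to flash.
        if self.backup_power_ok:
            self.flash = bytes(self.dram)

    def on_power_restore(self):
        # Controller copies the flash image back into DRAM/SRAM, so the
        # processor resumes exactly where it left off.
        self.dram[:] = self.flash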
In some ways it’s storage and in some ways it’s memory, so Continue reading “Where does NVRAM Fit?”
White Paper: Matching Flash to the Processor
I have just added a new white paper to the Objective Analysis website: Matching Flash to the Processor – Why Multithreading Needs Parallelized Flash.
This document examines the evolution of today’s CPUs, whose clock frequencies have stopped increasing and which now exploit parallelism to scale performance. Multiple DRAM channels have also been added to performance computers to bring that same parallelism to the memory channel.

Storage hasn’t kept pace with this move to parallelism, and that is limiting today’s systems.

New NAND flash DIMMs recently introduced by Diablo, SanDisk, and IBM provide a reasonable approach to adding parallel flash to a system on its fastest bus: the memory channel. This white paper shows that storage can be scaled to match the processor’s growing performance by adding flash DIMMs to each of the many DRAM buses in a performance server.
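As a rough illustration of that scaling argument, the short Python sketch below multiplies an assumed per-DIMM bandwidth by the number of populated DRAM channels. The 2 GB/s figure is an illustrative assumption, not a measured specification of any of these products.

PER_DIMM_GBPS = 2.0          # assumed sustained bandwidth per flash DIMM (illustrative)

def aggregate_bandwidth(channels, dimms_per_channel=1):
    # Total flash bandwidth when every DRAM channel carries flash DIMMs.
    return channels * dimms_per_channel * PER_DIMM_GBPS

for channels in (2, 4, 8):
    print(f"{channels} DRAM channels -> {aggregate_bandwidth(channels):.1f} GB/s of flash bandwidth")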
The white paper is downloadable for free from the Objective Analysis home page. Have a look.