Where Does NVRAM Fit?
What is NVRAM? Quite simply, it’s DRAM or SRAM backed by a flash memory, a small controller, and a battery or super-capacitor. During normal operation the DRAM or SRAM is used in the system the same way that any DRAM or SRAM would be used. When power is interrupted, the controller moves all of the data from the DRAM or SRAM to the flash using the backup power from the battery or super-capacitor. When power is restored, the controller moves the contents of the flash back into the SRAM or DRAM, and the processor can resume operation where it left off.
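That save-and-restore sequence can be sketched in a few lines of Python. This is a toy model, not real controller firmware; the class and method names are invented for illustration:

```python
class NVDIMMSim:
    """Toy model of an NVDIMM controller's power-fail sequence.
    Names here are hypothetical, purely for illustration."""

    def __init__(self, size):
        self.dram = bytearray(size)   # volatile working memory
        self.flash = bytearray(size)  # nonvolatile backup store

    def power_fail(self):
        # On power loss, the controller drains DRAM into flash
        # while running on the battery or super-capacitor.
        self.flash[:] = self.dram

    def power_off(self):
        # Once backup power is exhausted, DRAM contents are lost.
        self.dram = bytearray(len(self.dram))

    def power_restore(self):
        # On restart, the controller reloads DRAM from flash
        # before the host processor resumes.
        self.dram[:] = self.flash

nv = NVDIMMSim(16)
nv.dram[0:4] = b"data"  # the host writes to DRAM as usual
nv.power_fail()
nv.power_off()
nv.power_restore()
recovered = bytes(nv.dram[0:4])  # b"data" survives the outage
```

The point of the sketch is simply that the host never talks to the flash directly; the controller makes the DRAM *appear* nonvolatile.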
In some ways it’s storage and in some ways it’s memory, so I had a hard time deciding whether to put this post in The Memory Guy or in The SSD Guy. In the end I decided that the impact of this technology would be greater on storage, so that’s how it ended up here!
Simtek and Xicor were early producers of NVRAMs, introducing monolithic NOVRAM chips in the 1980s. These devices merged SRAM with an EEPROM. Xicor stopped manufacturing the product in 2000, while Simtek added flash-backed DRAM modules to its product line under the AgigA Tech name and was later acquired by Cypress Semiconductor.
Like any DIMM-style module, a DRAM + flash NVDIMM (a DIMM-format NVRAM) is relatively simple to manufacture, so a number of other companies now make competing products, including Micron, Netlist, SMART Modular, and Viking. At the Flash Memory Summit, PMC Sierra just announced a variation on this design that uses the PCIe interface and the larger PCIe card format, producing something that can be viewed either as an extraordinarily fast SSD or as a means of adding more DRAM to a system than the system can normally support.
But let’s get back to the original question: Where does it fit?
Those who know The SSD Guy well have heard me say that the reason that flash fits in computing systems is because it’s slower than DRAM but faster than an HDD, and it’s cheaper than DRAM but more costly than an HDD. Any memories that don’t fit into this hierarchy will sell into a smaller niche market in which OEMs will pay more for some special feature.
NVDIMMs are more expensive than either DRAM or an HDD and are no faster than DRAM. That automatically relegates this product to a niche consisting of those OEMs that are willing to pay more for nonvolatility. To date this has included certain very sensitive applications like casino games and Automated Teller Machines (ATMs). When these systems suffer a power outage it is important that they resume operation exactly where they left off, as if the outage had never happened. Producers of these systems and their customers are willing to pay more for this ability.
But there is another area in which NVDIMMs have created interest lately. We all know that DRAM will eventually stop scaling, and when it does it will be replaced by some other technology. All of the likely replacement technologies are nonvolatile. In anticipation of this, IBM has already focused its research efforts on ways to take advantage of this nonvolatility and has coined the term “Storage Class Memory” (SCM) to describe the use of nonvolatile main memory in a computing environment.
NVDIMMs are being used as a prototype for SCM, and standards organizations like SNIA are using NVDIMMs as a tool to help them define programming protocols for SCM environments. SNIA has, in fact, already released its initial NVM Programming Model based on inputs from NVDIMM makers and other interested parties.
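The core idea behind that programming model is that software reaches persistent memory through ordinary loads and stores into a memory-mapped file, followed by an explicit flush to guarantee persistence. A minimal sketch of that pattern, using an ordinary file as a stand-in for a persistent-memory region (on real hardware this would be a direct-mapped NVDIMM file):

```python
import mmap
import os
import tempfile

# An ordinary file stands in for the persistent-memory region.
path = os.path.join(tempfile.mkdtemp(), "pmem.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # size the region

# Map the region and update it with plain memory stores.
with open(path, "r+b") as f:
    region = mmap.mmap(f.fileno(), 4096)
    region[0:5] = b"hello"  # an ordinary store, no I/O call
    region.flush()          # explicit sync to guarantee persistence
    region.close()

# After everything is closed (our stand-in for a power cycle),
# the stored data is still there.
with open(path, "rb") as f:
    recovered = f.read(5)   # b"hello"
```

The contrast with a conventional storage stack is that there is no read/write system call on the data path, only the final flush; that is the behavior NVDIMMs let standards bodies prototype today.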
So, where do NVRAMs fit? There are two places: In applications where seamless recovery from a power failure is worth the extra cost, and in systems used to develop tomorrow’s computing architectures.