SSDs use a huge number of internal parameters to achieve a tricky balance between performance, wear, and cost. The SSD Guy likes to compare this to a recording studio console like the one in this post’s graphic to emphasize just how tricky it is for SSD designers to find the right balance. Imagine trying to manage all of those knobs! (The picture is JacoTen’s Wikipedia photo of a Focusrite console.)
Vendors who produce differentiated SSDs pride themselves on their ability to fine-tune these parameters to achieve better performance or endurance than competing products.
About a year ago I suggested to the folks at NVMdurance that they might consider applying their machine learning algorithm to this problem. (The original NVMdurance product line was described in a Memory Guy post a while ago.) After all, the company makes a machine learning engine that tunes the numerous internal parameters of a NAND flash chip to extend the chip’s life while maintaining the specified performance. SSD management would be a natural application of machine learning, since both SSDs and NAND flash chips are currently tuned through difficult and time-consuming manual processes to find the best mix of parameters to drive the design.
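To make the tuning problem concrete, here is a minimal sketch of automated parameter search. Everything in it is hypothetical: the parameter names, ranges, endurance model, and performance constraint are invented for illustration, and a real engine like NVMdurance’s would use far richer characterization models and smarter search than plain random sampling.

```python
import random

# Hypothetical NAND trim parameters and ranges (illustration only).
PARAM_RANGES = {
    "program_voltage": (14.0, 20.0),   # volts
    "program_pulse_us": (10, 200),     # microseconds
    "erase_voltage": (18.0, 24.0),     # volts
}

def endurance_score(params):
    # Toy stand-in for chip characterization: gentler programming
    # (lower voltages, shorter pulses) wears the cells less.
    return ((20.0 - params["program_voltage"]) * 100
            + (200 - params["program_pulse_us"]) * 5
            + (24.0 - params["erase_voltage"]) * 50)

def meets_performance(params):
    # Toy constraint: programming must remain strong enough to hit
    # the specified write latency.
    return params["program_voltage"] * params["program_pulse_us"] >= 900

def random_search(trials=10_000, seed=42):
    """Sample parameter sets, keep the best one that meets spec."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(trials):
        cand = {name: rng.uniform(lo, hi)
                for name, (lo, hi) in PARAM_RANGES.items()}
        if not meets_performance(cand):
            continue
        score = endurance_score(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

best, score = random_search()
```

The point of the sketch is the shape of the problem, not the method: a machine must find a parameter set that maximizes endurance subject to a performance spec, which is exactly the iterative work that manual tuning makes slow and expensive.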
Little did I know that NVMdurance’s researchers had already embarked on this path, and were designing a product that was just released this month.
The NVMdurance Aviator is a machine learning program that helps solve one very tricky part of SSD design: creating the log-likelihood ratio (LLR) tables for the sophisticated LDPC error correction that has become necessary for today’s NAND flash chips. NVMdurance tells me that LLR tables can only be generated by deeply characterizing the flash chip and then translating that characterization data into error models, which are then used to generate application-specific LLR tables for the LDPC engine.
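For readers unfamiliar with LLR tables, the idea can be shown in a few lines. An LLR expresses how strongly a soft read favors a 0 over a 1: the log of the ratio of the two probabilities, quantized for the LDPC decoder’s fixed-point input. The sketch below is my own illustration, not NVMdurance’s method; the five-region layout, probabilities, scale, and clamp values are all invented, whereas real characterization produces these probabilities per chip, per page type, and per wear level.

```python
import math

def llr_table(p0_given_region, p1_given_region, scale=4, max_llr=15):
    """Build a quantized LLR table for an LDPC soft decoder.

    p0_given_region[i] / p1_given_region[i]: characterized probabilities
    that a cell read in soft-read voltage region i actually stores a
    0 / 1 bit. (Hypothetical data layout for illustration.)
    """
    table = []
    for p0, p1 in zip(p0_given_region, p1_given_region):
        llr = math.log(p0 / p1)            # positive -> 0 more likely
        q = round(llr * scale)             # fixed-point quantization
        q = max(-max_llr, min(max_llr, q)) # clamp to decoder range
        table.append(q)
    return table

# Five soft-read regions from (hypothetical) chip characterization:
p0 = [0.99, 0.90, 0.55, 0.10, 0.01]
p1 = [0.01, 0.10, 0.45, 0.90, 0.99]
print(llr_table(p0, p1))  # -> [15, 9, 1, -9, -15]
```

The hard part in practice is not this arithmetic but obtaining accurate probabilities for every operating condition of the chip, which is why the characterization step dominates the effort.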
The company expects its customers to reduce their time to market by using machine learning to develop these tables rather than leaving the task to costly experts using very complex and time-consuming manual methods. Being first to market is extraordinarily important in the field of technology.
My company, Objective Analysis, sees machine learning as a solution to many of the problems faced in storage and chips today. We know that machine learning is already used to schedule product movement through semiconductor fabs, but it is also very likely to help with managing the storage hierarchy. Our reports on the SSD market stress that automatic data placement delivers far better performance than manual data placement in hybrid storage systems, a point made obvious by benchmarks that compare the two approaches.
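Automatic data placement in a hybrid system can be as simple as promoting frequently accessed blocks to the fast tier. The sketch below is my own minimal illustration of that idea, not any vendor’s algorithm; the threshold-based policy and the block names are invented for the example.

```python
from collections import Counter

class HybridPlacer:
    """Toy automatic tiering: hot blocks go to SSD, cold stay on HDD."""

    def __init__(self, hot_threshold=3):
        self.hot_threshold = hot_threshold
        self.access_counts = Counter()

    def record_access(self, block_id):
        self.access_counts[block_id] += 1

    def tier_for(self, block_id):
        # Promote once recent accesses cross the hotness threshold.
        if self.access_counts[block_id] >= self.hot_threshold:
            return "ssd"
        return "hdd"

placer = HybridPlacer()
for _ in range(5):
    placer.record_access("db_index")     # hot block
placer.record_access("cold_archive")     # touched once
print(placer.tier_for("db_index"), placer.tier_for("cold_archive"))
# -> ssd hdd
```

Real automatic placement engines learn access patterns far more subtly than a fixed counter threshold, which is precisely why they beat manual placement in benchmarks: no administrator can re-pin data as fast as the workload shifts.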
This is a solution that is likely to gain acceptance over time. As SSDs and NAND flash chips become more complex, it will only make sense to reduce the effort required to get them to market by letting machine learning do the iterative work.
The product is in its formative phase: the initial offering of the NVMdurance Aviator works with Micron’s first-generation 3D TLC NAND. Over time it is likely to be ported to a wide range of competing NAND flash chips.