Earlier today at the Flash Memory Summit 2022 (FMS 2022) event, Samsung unveiled its new CXL-based "Memory-semantic" SSDs. While SSDs with an onboard DRAM cache are nothing new, these Memory-semantic SSDs pair that cache with the high-speed Compute Express Link (CXL) interconnect.
As such, Samsung promises up to a 20x improvement (a 1900% increase) in random read speed and latency. This is exciting since random performance is generally the Achilles' heel of most SSDs, especially DRAM-less SATA drives.
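For anyone double-checking the headline numbers, here is a quick sketch (my own illustration, not from Samsung's materials) of how a speedup factor maps to a percentage improvement over the baseline:

```python
# A "20x" speedup is the same thing as a 1900% improvement over the baseline.
def speedup_to_percent_improvement(factor: float) -> float:
    """Convert a speedup factor (e.g. 20x) into a percentage improvement."""
    return (factor - 1.0) * 100.0

print(speedup_to_percent_improvement(20))  # -> 1900.0
```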
In its press release announcement for FMS 2022, Samsung says:
Samsung announced its ‘Memory-semantic SSD’ that combines the benefits of storage and DRAM memory. Leveraging Compute Express Link (CXL) interconnect technology and a built-in DRAM cache, Memory-semantic SSDs can achieve up to a 20x improvement in both random read speed and latency when used in AI and ML applications. Optimized to read and write small-sized data chunks at dramatically faster speeds, Samsung’s Memory-semantic SSDs will be ideal for the growing number of AI and ML workloads that require very fast processing of smaller data sets.
Source: Samsung says its CXL "Memory-semantic" SSDs are up to 20x faster in random performance (www.neowin.net)