RAID Cache

As part of my series of articles on RAID, I want to focus on a very important RAID controller feature: the memory cache.

The process of transferring data to and from disk storage includes storing the data temporarily in a memory cache located on the RAID (Redundant Array of Independent Disks) controller that is managing the data transfer. In other words, RAID controller cards temporarily cache data from the host system until it is successfully written to the storage media.

Memory cache consists of high-speed DRAM (Dynamic Random Access Memory) chips. The access time for writing data to or reading data from DRAM is roughly 10^6, or a million, times faster than the typical access time for writing directly to or reading directly from a set of disk drives. In a posted-write operation, as soon as the host computer writes data to the cache, the write operation is complete and the host is freed up immediately to perform another operation; the host does not have to wait for the data to be transferred to disk. Therefore, cache memory on RAID controllers significantly speeds up write operations and increases overall system performance. Current cache sizes range from 256MB and 512MB up to 1GB, 2GB, and even 4GB.
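To put that difference in perspective, here is a minimal back-of-envelope sketch in Python. The latency figures (roughly 100 ns for a DRAM write, roughly 5 ms for a disk write) and the write count are assumptions chosen purely to show the order-of-magnitude gap, not measurements from any particular controller.

```python
# Illustrative comparison of posted (write-back) vs. direct writes.
# All numbers below are assumed round figures for illustration only.

DRAM_WRITE_LATENCY_S = 100e-9   # ~100 ns to land a write in controller cache
DISK_WRITE_LATENCY_S = 5e-3     # ~5 ms for a random write to a spinning disk

NUM_WRITES = 10_000

# Posted write: the host is released as soon as the data reaches the cache.
posted_host_wait = NUM_WRITES * DRAM_WRITE_LATENCY_S

# Direct write: the host waits for each write to reach the disk.
direct_host_wait = NUM_WRITES * DISK_WRITE_LATENCY_S

print(f"Host wait, posted writes: {posted_host_wait * 1e3:.3f} ms")
print(f"Host wait, direct writes: {direct_host_wait * 1e3:.1f} ms")
print(f"Speedup seen by the host: ~{direct_host_wait / posted_host_wait:,.0f}x")
```

The point of the sketch is simply that the host's wait time is governed by the cache, not the disks, as long as the cache can absorb the writes.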

Another advantage of the cache is that, in the event of a system failure, it can hold the in-flight data (up to the size of the buffer) until the system comes back online, which helps prevent or minimize data loss. Of course, the cache will eventually lose its charge (within about 30 minutes), so RAID controllers usually come with a battery backup unit (BBU) to hold the charge longer until the system is brought back online and the cached data can be written out to disk.

Without a memory cache the controller has to stop the flow of data until the drive has acknowledged that it is ready to receive. The controller then receives the data from the operating system and forwards it on to the drive being written to. Each such pause is very short and would hardly be noticed on its own, but across the millions or billions of these transactions that happen every day, even tiny per-operation delays add up to noticeable levels. A memory cache allows the controller to go ahead and accept data from the operating system and place it into the buffer even when the drive is not ready for it at that moment; and as all business majors (and even non-business majors) understand, time is money. A conceptual sketch of this buffering follows.
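Conceptually, the controller acts like a producer-consumer buffer: the host keeps handing off writes at full speed while a slower background process drains them to disk. Below is a toy Python sketch of that idea; the buffer size, timings, and names are invented for illustration and do not reflect any real controller firmware.

```python
# Toy model of a controller write buffer: the "host" submits writes as
# fast as it can, while a slower "drive" thread drains them in the
# background. All sizes and delays are made-up illustrative values.
import queue
import threading
import time

write_buffer = queue.Queue(maxsize=64)   # the controller's cache: room for 64 pending writes

def drive_worker():
    # Drains the buffer at "disk speed"; the host never waits on this directly.
    while True:
        block = write_buffer.get()
        if block is None:                 # sentinel: no more writes coming
            break
        time.sleep(0.005)                 # pretend each disk write takes ~5 ms
        write_buffer.task_done()

drive = threading.Thread(target=drive_worker)
drive.start()

start = time.time()
for block_no in range(100):
    write_buffer.put(f"block-{block_no}")  # returns as soon as the buffer has room
print(f"Host finished submitting after {time.time() - start:.3f} s")

write_buffer.join()                        # wait until the drive has caught up
write_buffer.put(None)
drive.join()
print(f"All data on disk after         {time.time() - start:.3f} s")
```

Running it shows the host finishing its submissions long before the simulated drive has absorbed them, which is exactly the effect the controller cache provides.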

I hope this was informative and will help you understand when the benefits of higher-end RAID controllers can work to your advantage.