Multilevel caching

Caching

Caching has been successfully deployed to improve Web performance for static content. Caching proxies keep local copies of frequently requested resources, allowing large organizations to significantly reduce their upstream bandwidth usage and cost while improving response times and reducing load on the Internet. On the first request, the proxy fetches the resource from the remote web server and caches it, so that later requests for the same resource can be served locally, as fast as possible.
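The lookup-then-fetch behavior described above can be sketched as follows. This is a minimal illustration, not a real proxy: `fetch_remote` and the example URL are hypothetical stand-ins for an actual upstream HTTP request.

```python
# Local store of previously fetched resources, keyed by URL.
cache = {}

def fetch_remote(url):
    # Placeholder for the real request to the remote web server.
    return f"content of {url}"

def get(url):
    # Serve from the local copy when we have one; otherwise fetch
    # from the origin server once and keep the result for later.
    if url not in cache:
        cache[url] = fetch_remote(url)
    return cache[url]

get("http://example.com/logo.png")   # first request: goes upstream
get("http://example.com/logo.png")   # repeat request: served locally
```

Every repeat request for the same URL is answered from the local copy, which is where the bandwidth and latency savings come from.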

Multilevel caching

A multilevel cache uses more than one level of cache in order to make the speed of cache access approach the speed of the CPU while still holding a large number of cached objects. Based on the content held at each level, a multilevel cache can be classified into two major categories:


 * Mutual inclusion
 * Mutual exclusion
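The basic lookup path shared by both categories can be sketched as below, assuming dict-backed levels where level 0 is the small, fast cache checked first; the names `level0`, `level1`, and `lookup` are illustrative, not a real API.

```python
level0 = {}   # small, fast cache, checked first
level1 = {}   # larger, slower cache, checked second

def lookup(key, backing_store):
    if key in level0:
        return level0[key]        # level-0 hit: fastest path
    if key in level1:
        value = level1[key]       # level-1 hit
        level0[key] = value       # promote for faster reuse
        return value
    value = backing_store[key]    # miss in both levels
    level1[key] = value
    level0[key] = value
    return value
```

The inclusion and exclusion policies differ in what happens on the fill and eviction steps, i.e. whether a block may live in both levels at once.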

Mutual inclusion

Multilevel inclusion is a technique in which the levels share data: every block held in the level 0 cache is also present in the larger level 1 cache, so recently used data exists in both levels at once. Normally the level 0 cache is made smaller than the level 1 cache.

The main disadvantage of this implementation is that it keeps the same data in two different locations, which wastes memory space. It also complicates eviction: removing a block from the level 1 cache requires updating (invalidating) the corresponding entry in level 0 as well, which is a tedious job to perform. This form of multilevel caching already exists and is used by Intel in its microprocessor chips.
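A minimal sketch of the inclusive policy, assuming dict-backed levels with illustrative names: a fill writes the block into both levels, and a level-1 eviction must also invalidate level 0, which is exactly the two-level update cost noted above.

```python
level0, level1 = {}, {}

def fill(key, value):
    level1[key] = value      # block always lives in level 1...
    level0[key] = value      # ...and is duplicated in level 0

def evict_from_level1(key):
    level1.pop(key, None)
    level0.pop(key, None)    # back-invalidate to preserve inclusion

fill("a", 1)
assert "a" in level0 and "a" in level1   # same data in both levels
evict_from_level1("a")
assert "a" not in level0                 # eviction updates both
```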

Mutual exclusion

Multilevel exclusion is a technique in which the two levels hold disjoint data: a block is present either in the small level 0 cache or in the level 1 cache, but never in both. This avoids any duplication between the two cache levels, giving lower memory consumption and a simpler cache update policy. Such a concept was used successfully in AMD's Athlon processors.
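The exclusive policy can be sketched as below, again with dict-backed levels and illustrative names: a level-1 hit moves the block up rather than copying it, and a level-0 eviction demotes the block down, so the two levels always stay disjoint.

```python
level0, level1 = {}, {}

def access(key, backing_store):
    if key in level0:
        return level0[key]
    if key in level1:
        value = level1.pop(key)   # move (not copy) up to level 0
        level0[key] = value
        return value
    value = backing_store[key]    # miss: fill level 0 only
    level0[key] = value
    return value

def demote(key):
    # On a level-0 eviction the block drops down into level 1,
    # keeping the two levels disjoint.
    if key in level0:
        level1[key] = level0.pop(key)
```

Because a block occupies space in only one level at a time, the effective capacity is the sum of the two levels, which is the memory-consumption advantage described above.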