SELECTIVE VICTIM CACHING

Selective victim caching is an improvement over victim caching that aims to reduce miss rates, especially in direct-mapped caches. In victim caching, whenever an access hits in the victim cache but misses in the L1 cache, the conflicting blocks in the two caches are always exchanged. In selective victim caching, this interchange is performed selectively, according to a prediction algorithm based on the history of the block. Similarly, on a miss in both the L1 and victim caches, victim caching always places the incoming block in the L1 cache, whereas selective victim caching places it in either the L1 cache or the victim cache, again according to the same prediction algorithm. A sketch of the two miss-handling flows is given below.
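The difference is easiest to see side by side. Below is a minimal, self-contained Python sketch of the two miss-handling flows, assuming a direct-mapped L1 cache and a small fully associative victim cache. The class names, the FIFO victim replacement, and the predict_place_in_l1 callback are illustrative assumptions made for this example, not details taken from the paper or from the tool.

from collections import deque

class DirectMappedL1:
    """One block per set; each address maps to exactly one set."""
    def __init__(self, num_sets):
        self.num_sets = num_sets
        self.blocks = [None] * num_sets

    def index(self, addr):
        return addr % self.num_sets

    def hit(self, addr):
        return self.blocks[self.index(addr)] == addr

    def conflicting(self, addr):
        return self.blocks[self.index(addr)]

    def replace(self, addr):
        """Install addr in its set and return the evicted block (or None)."""
        i = self.index(addr)
        evicted, self.blocks[i] = self.blocks[i], addr
        return evicted

class VictimCache:
    """Small fully associative buffer; FIFO replacement is an assumption made here."""
    def __init__(self, capacity):
        self.blocks = deque(maxlen=capacity)

    def hit(self, addr):
        return addr in self.blocks

    def insert(self, addr):
        if addr is not None:
            self.blocks.append(addr)

    def remove(self, addr):
        self.blocks.remove(addr)

def access_victim(addr, l1, victim):
    """Plain victim caching: exchange and placement are unconditional."""
    if l1.hit(addr):
        return "L1 hit"
    if victim.hit(addr):
        victim.remove(addr)
        victim.insert(l1.replace(addr))      # always swap on a victim-cache hit
        return "victim hit, swapped"
    victim.insert(l1.replace(addr))          # incoming block always placed in L1
    return "miss"

def access_selective(addr, l1, victim, predict_place_in_l1):
    """Selective victim caching: the same two decisions, gated by a predictor."""
    if l1.hit(addr):
        return "L1 hit"
    resident = l1.conflicting(addr)
    if victim.hit(addr):
        if predict_place_in_l1(addr, resident):
            victim.remove(addr)
            victim.insert(l1.replace(addr))  # swap only when predicted to be useful
            return "victim hit, swapped"
        return "victim hit, not swapped"     # access served from the victim cache
    if predict_place_in_l1(addr, resident):
        victim.insert(l1.replace(addr))      # incoming block placed in L1
    else:
        victim.insert(addr)                  # incoming block placed in the victim cache
    return "miss"

Passing predict_place_in_l1 = lambda incoming, resident: True makes the selective flow behave exactly like plain victim caching, which is the baseline the prediction algorithm tries to improve on.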

The prediction algorithm is based on the dynamic exclusion algorithm. It uses two bits associated with every cache block: a hit bit and a sticky bit. The hit bit is associated with the L1 cache block; a hit bit of 1 indicates that at least one hit occurred to the block during its most recent stay in the L1 cache. The sticky bit prevents thrashing by giving the resident block some inertia against replacement. Details of the prediction algorithm can be found in the paper by Dimitrios Stiliadis and Anujan Varma, "Selective Victim Caching: A Method to Improve the Performance of Direct-Mapped Caches".
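As a rough illustration of how these bits could drive the decision, the sketch below provides a predictor with the signature used in the flows above. It is a hedged reconstruction of the general dynamic-exclusion idea, not the exact rules published in the paper; the dictionary-based state and the on_l1_hit helper are assumptions made for the example.

hit_bit = {}     # per block: 1 if the block saw at least one hit in its last L1 residency
sticky_bit = {}  # per L1 block: 1 gives the resident block inertia against replacement

def on_l1_hit(addr):
    """On an L1 hit, record that the block is useful and renew its inertia."""
    hit_bit[addr] = 1
    sticky_bit[addr] = 1

def predict_place_in_l1(incoming, resident):
    """Return True to place the incoming block in L1, False to place it in the victim cache."""
    if resident is None:
        return True                        # empty set: nothing to protect
    if hit_bit.get(incoming, 0) == 1 or sticky_bit.get(resident, 0) == 0:
        hit_bit[incoming] = 0              # the hit bit is re-earned during the new residency
        sticky_bit[incoming] = 1
        return True
    sticky_bit[resident] = 0               # the resident block keeps its slot but loses inertia
    return False

In a full simulation, the L1-hit path of access_selective would also call on_l1_hit(addr) so that the two bits actually track each block's history.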

In this project, I try to demonstrate the advantages of selective victim caching over victim caching. The tool lets the user compare normal (direct-mapped) caching, victim caching, and selective victim caching.

Click HERE to use the tool.

If you need help using or understanding the tool, please click HERE.