A BRIEF INTRODUCTION TO CACHE LOCALITY

YIN WEI DONG

14 SS

THINGS TO BE DISCUSSED

• Types of locality

• Algorithms to improve data locality

• History of locality of reference

• Thrashing

• Applications

TEMPORAL LOCALITY

• If at one point in time a particular memory location is referenced, then it is likely that the same location will be referenced again in the near future.

• Exploited by storing a copy of the referenced data in special memory storage that can be accessed faster.

SPATIAL LOCALITY

• If a particular memory location is referenced at a particular time, then it is likely that nearby memory locations will be referenced in the near future.

• Exploited by predicting the size and shape of the region around the current reference for which it is worthwhile to prepare faster access.
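As an illustrative sketch (not from the slides), the two functions below traverse the same row-major 2D array in different orders. In a low-level language the row-major loop is much faster, because each fetched cache line serves several subsequent accesses; pure Python blunts the timing effect, but the access patterns are the same idea.

```python
# Row-major traversal visits elements in the order they are laid out
# in memory: good spatial locality. Column-major traversal jumps a
# full row-length stride between consecutive accesses: poor spatial
# locality in a row-major layout.

def sum_row_major(matrix):
    total = 0
    for row in matrix:
        for value in row:       # adjacent elements, same cache line
            total += value
    return total

def sum_column_major(matrix):
    total = 0
    for col in range(len(matrix[0])):
        for row in matrix:      # large stride between accesses
            total += row[col]
    return total

m = [[r * 100 + c for c in range(100)] for r in range(100)]
assert sum_row_major(m) == sum_column_major(m)  # same result, different pattern
```

Both compute the same sum; only the order in which memory is touched differs, which is exactly what spatial locality is about.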

BRANCH LOCALITY

• Only a few possible alternatives exist for the prospective part of the execution path.

• Not spatial locality, since the few alternatives may be located far away from each other in memory.

OVERVIEW

• The average memory reference time is

T = m × Tm + Th + E

T = average memory reference time

m = miss ratio

Tm = time to make a main-memory reference when there is a miss

Th = the latency: the time to reference the cache when there is a hit

E = various secondary effects

Important factors: LATENCY and HIT RATIO
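A minimal sketch of the formula above, with made-up illustrative numbers; the function and parameter names are assumptions, not from the slides.

```python
# T = m * Tm + Th + E, evaluated with hypothetical values.

def average_reference_time(miss_ratio, miss_penalty_ns, hit_latency_ns, secondary_ns=0.0):
    """Average memory reference time: T = m * Tm + Th + E."""
    return miss_ratio * miss_penalty_ns + hit_latency_ns + secondary_ns

# Example: 3% miss ratio, 100 ns miss penalty, 2 ns hit latency.
t = average_reference_time(0.03, 100.0, 2.0)
# 0.03 * 100 + 2 = 5.0 ns
```

The formula makes the two important factors visible: lowering the hit latency lowers Th directly, and raising the hit ratio lowers m, shrinking the miss-penalty term.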

EXAMPLE : LEAST RECENTLY USED (LRU)

• Discards the least recently used items first

• Expensive if implemented so that it always discards exactly the least recently used item

• Implementation : keeping "age bits"
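A minimal LRU sketch (illustrative, not a production cache): Python's `OrderedDict` recency order stands in for the "age bits" the slide mentions, with the least recently used key always at the front.

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # front = least recently used

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)      # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # discard least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now most recently used
cache.put("c", 3)     # evicts "b", the least recently used
assert cache.get("b") is None
assert cache.get("a") == 1
```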

EXAMPLE : MOST RECENTLY USED (MRU)

• Discards the most recently used items first.

• In contrast to LRU

• Better for random access patterns and repeated scans over large datasets
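An illustrative MRU sketch: the recency bookkeeping is the same as for LRU, but eviction takes the item at the back of the order, i.e. the most recently used one.

```python
from collections import OrderedDict

class MRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # back = most recently used

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
            self.items[key] = value
            return
        if len(self.items) >= self.capacity:
            self.items.popitem(last=True)   # discard MOST recently used
        self.items[key] = value

cache = MRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)     # evicts "b", the most recently used
assert cache.get("b") is None
assert cache.get("a") == 1
```

In a cyclic scan over a dataset larger than the cache, this "contrast to LRU" pays off: the item just used is the one whose next use is farthest away.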

EXAMPLE : RANDOM REPLACEMENT (RR)

• Randomly selects a candidate item and discards it to make space when necessary

• Used in ARM processors for its simplicity
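An illustrative random-replacement sketch: eviction picks a uniformly random resident key, so no per-access recency bookkeeping is needed at all — the simplicity the slide credits for its use in some processor cache designs.

```python
import random

class RRCache:
    def __init__(self, capacity, seed=None):
        self.capacity = capacity
        self.items = {}
        self.rng = random.Random(seed)   # seeded for reproducibility

    def get(self, key):
        return self.items.get(key)

    def put(self, key, value):
        if key not in self.items and len(self.items) >= self.capacity:
            victim = self.rng.choice(list(self.items))  # random eviction
            del self.items[victim]
        self.items[key] = value

cache = RRCache(2, seed=0)
cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)             # evicts either "a" or "b" at random
assert len(cache.items) == 2
assert cache.get("c") == 3
```

Note that `get` touches no metadata: unlike LRU or MRU, hits are completely free of bookkeeping, which is what makes the policy cheap in hardware.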

OPTIMAL SOLUTION IN THEORY

• The most efficient caching algorithm would be to always discard the information that will not be needed for the longest time in the future. 

• Not implementable in practice, because it is impossible to predict how far in the future information will be needed.

• Useful as a benchmark: after a run, the effectiveness of the actually chosen cache algorithm can be compared against this optimal solution.
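This after-the-fact comparison can be sketched as follows (an illustrative simulation, not from the slides; the function names and the reference string are made up). The optimal policy, often called Belady's algorithm, looks ahead in the complete trace, which is exactly why it only works as a benchmark.

```python
from collections import OrderedDict

def simulate_opt(trace, capacity):
    """Optimal policy: evict the block whose next use is farthest away."""
    cache, hits = set(), 0
    for i, ref in enumerate(trace):
        if ref in cache:
            hits += 1
            continue
        if len(cache) >= capacity:
            def next_use(block):           # distance to the block's next reference
                for j in range(i + 1, len(trace)):
                    if trace[j] == block:
                        return j
                return float("inf")        # never used again: ideal victim
            cache.remove(max(cache, key=next_use))
        cache.add(ref)
    return hits

def simulate_lru(trace, capacity):
    """LRU for comparison, using recency order in an OrderedDict."""
    cache, hits = OrderedDict(), 0
    for ref in trace:
        if ref in cache:
            hits += 1
            cache.move_to_end(ref)
            continue
        if len(cache) >= capacity:
            cache.popitem(last=False)
        cache[ref] = True
    return hits

trace = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
assert simulate_opt(trace, 3) >= simulate_lru(trace, 3)  # OPT is never worse
```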

CACHE-OBLIVIOUS ALGORITHM

• Designed to take advantage of a CPU cache without taking the size of the cache as a parameter.

• Good performance on multiple machines with different cache sizes without modification

• Typically works by recursive divide and conquer: the problem is subdivided until the subproblems fit in cache, whatever the cache size is.
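The recursive divide-and-conquer idea can be sketched with matrix transposition, a classic cache-oblivious example (an illustrative sketch, not from the slides). The recursion keeps halving the larger dimension; at some depth the blocks fit in cache, and no cache size ever appears as a parameter. The `cutoff` below is only a base case to limit Python recursion, not a tuned cache size.

```python
def transpose(src, dst, row, col, rows, cols, cutoff=16):
    """Cache-obliviously write the transpose of a sub-block of src into dst."""
    if rows <= cutoff and cols <= cutoff:
        # Base case: small block, transpose element by element.
        for r in range(row, row + rows):
            for c in range(col, col + cols):
                dst[c][r] = src[r][c]
    elif rows >= cols:
        half = rows // 2                   # split the taller dimension
        transpose(src, dst, row, col, half, cols, cutoff)
        transpose(src, dst, row + half, col, rows - half, cols, cutoff)
    else:
        half = cols // 2                   # split the wider dimension
        transpose(src, dst, row, col, rows, half, cutoff)
        transpose(src, dst, row, col + half, rows, cols - half, cutoff)

n, m_ = 50, 70
a = [[r * m_ + c for c in range(m_)] for r in range(n)]
b = [[0] * n for _ in range(m_)]
transpose(a, b, 0, 0, n, m_)
assert all(b[c][r] == a[r][c] for r in range(n) for c in range(m_))
```

Because the splitting never consults a cache size, the same code performs well across machines with different cache hierarchies, which is the portability point of the slide.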

HISTORY OF LOCALITY OF REFERENCE

• The early page-replacement "learning algorithm" was controversial: it performed well on programs with well-defined loops and poorly on many other programs.

WHY THRASHING?

• Memory to disk: if the working set of a program or workload cannot be effectively held within physical memory, constant swapping of data between memory and disk may occur.

• CPU cache to memory: main memory is accessed in a pattern that makes multiple main-memory locations compete for the same cache lines, resulting in excessive cache misses and degraded performance.
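The working-set cliff can be made concrete with a small simulation (illustrative; the `lru_hits` helper is hypothetical): a cyclic access pattern over N distinct blocks under LRU hits on every pass once the cache holds the whole working set, but when the cache is even one block too small, LRU evicts each block just before it is needed again and every access misses.

```python
from collections import OrderedDict

def lru_hits(trace, capacity):
    """Count cache hits for an LRU cache of the given capacity."""
    cache, hits = OrderedDict(), 0
    for ref in trace:
        if ref in cache:
            hits += 1
            cache.move_to_end(ref)       # refresh recency on a hit
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[ref] = True
    return hits

trace = list(range(4)) * 10        # cyclic working set of 4 blocks, 40 accesses
assert lru_hits(trace, 4) == 36    # fits: only the 4 cold misses
assert lru_hits(trace, 3) == 0     # one block short: total thrashing
```

The jump from 90% hits to 0% with a single block less of capacity is the signature of thrashing, and it motivates the solutions on the next slide.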

SOLUTIONS

• Increase the amount of RAM in the computer.

• Decrease the number of programs being run on the computer.

• Replace programs that are memory-heavy with equivalents that use less memory.

• Assign working priorities to programs, i.e. low, normal, high.

• Improve spatial locality

APPLICATIONS OF CACHE LOCALITY

• In Web browsers to hold recent Web pages.

• In search engines to find the most relevant responses to queries.

• In video boards to accelerate graphics displays

• In edge servers to hold recent Web pages accessed by anyone in an organization or geographical region

• The principle of cache locality will be useful wherever there is an advantage in reducing the apparent distance from a process to the objects it accesses

END
