Friday, December 30, 2011

Analogy of a Cache

For those of you who are tech-challenged, here's a piece from one of my favorite tech geeks. Enjoy!

It's Analogy Time: How Caches Work
Caches are pretty simple to understand. Let's say you're running a library with thousands of books and thousands of users. Allowing everyone to roam freely would create a traffic nightmare so you make all requests for books go through your help desk. A customer approaches the desk, tells your employee what book they want, and your employee runs to grab the book. This happens for each request. The time it takes your employee to return with a book after receiving a request from a customer is your service time and it's a value that you want to keep as low as possible in order to prevent you losing your customers to another library.

Over time, your employee may notice that certain books are frequently requested. A smart employee would keep copies of these books at the help desk to serve those requests more quickly. With a large enough desk, your employee could likely satisfy a good percentage of incoming requests on the spot. That would lower your average service time and let you serve more customers, while a request for a book not kept at the desk would still take the same amount of time as before.

If reading patterns change over time, your employee can adapt. Since there's only a finite amount of space at the help desk, books that are no longer read as often as they once were can be evicted from the desk and replaced with more popular titles.
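One common way to decide which book to evict is to toss out the one that hasn't been asked for in the longest time, a policy known as "least recently used" (LRU). A minimal sketch of the help desk as an LRU cache, in Python (the `HelpDesk` class and its names are my own invention for illustration):

```python
from collections import OrderedDict

class HelpDesk:
    """A tiny least-recently-used (LRU) cache, modeled on the help desk."""

    def __init__(self, capacity, stacks):
        self.capacity = capacity   # how many books fit on the desk
        self.stacks = stacks       # the whole library: title -> book
        self.desk = OrderedDict()  # books currently at the desk

    def request(self, title):
        if title in self.desk:
            # Cache hit: the book is on the desk. Mark it as the most
            # recently used so it isn't evicted any time soon.
            self.desk.move_to_end(title)
            return self.desk[title]
        # Cache miss: walk to the stacks (the slow part), then keep a
        # copy at the desk for next time.
        book = self.stacks[title]
        self.desk[title] = book
        if len(self.desk) > self.capacity:
            # Desk is full: evict the least recently used book.
            self.desk.popitem(last=False)
        return book
```

For example, with a desk that holds two books, requesting titles A, B, then C evicts A, since A is the one that's gone longest without being asked for.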

The library I've described above is an example of a cache. The books are, of course, the data, the customers are instructions, and the help desk is the cache itself. Data that's accessed more frequently is stored in the cache, and as access patterns change, so does the data in the cache.
