Cache
A cache is a small, partial copy of a larger collection of data, implemented so that access to the copy in the cache is faster than access to the same data in the large, full store.
The first caches were interposed between a CPU and main memory, but caching is now used throughout computer systems; examples include disk caches, where copies of commonly used disk blocks are kept in main memory, and network caches, where copies of data fetched over the network are kept on the local machine's disk.
Caches need algorithms to decide i) which items to keep in the cache (and how to structure the cache), and ii) for fixed-size caches, which item currently in the cache to discard, or evict, to make room when a new item is to be cached (the replacement policy). This topic has periodically been the subject of extensive research. A small sketch of one common replacement policy follows.
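As a concrete illustration of question ii), here is a minimal Python sketch of a least-recently-used (LRU) cache, one common replacement policy; the class name, capacity parameter and use of OrderedDict are illustrative assumptions, not a definitive implementation.

    from collections import OrderedDict

    class LRUCache:
        """Fixed-size cache that evicts the least-recently-used item."""

        def __init__(self, capacity):
            self.capacity = capacity
            self._items = OrderedDict()   # keys ordered least- to most-recently used

        def get(self, key):
            if key not in self._items:
                return None                # miss: caller must fetch from the full store
            self._items.move_to_end(key)   # mark as most recently used
            return self._items[key]

        def put(self, key, value):
            if key in self._items:
                self._items.move_to_end(key)
            self._items[key] = value
            if len(self._items) > self.capacity:
                self._items.popitem(last=False)  # evict the least-recently-used item

On a miss, get returns None and the caller fetches the item from the full store and puts it into the cache, which may in turn evict the least-recently-used entry.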
For caches whose items can be modified as well as read, an algorithm is also needed to decide whether, how and when to update the original copy of the data (the write policy); in some circumstances (e.g. a multiprocessor, with each CPU having its own cache of main memory) keeping the copies consistent can be a complicated question. This, too, has been a subject of much research over the decades.
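To make the update question concrete, the sketch below contrasts two common write policies: write-through, which updates the original copy on every write, and write-back, which marks the cached copy dirty and updates the original only when the item is evicted. The DictStore class and method names are assumptions for illustration, standing in for a disk or main memory.

    class DictStore:
        """Trivial backing store standing in for disk or main memory."""
        def __init__(self):
            self.data = {}
        def write(self, key, value):
            self.data[key] = value

    class WriteThroughCache:
        """Every write updates both the cache and the backing store at once."""
        def __init__(self, store):
            self.store = store
            self.items = {}
        def write(self, key, value):
            self.items[key] = value
            self.store.write(key, value)   # original copy is always up to date

    class WriteBackCache:
        """Writes update only the cache; the store is updated on eviction."""
        def __init__(self, store):
            self.store = store
            self.items = {}
            self.dirty = set()             # keys modified since last written back
        def write(self, key, value):
            self.items[key] = value
            self.dirty.add(key)
        def evict(self, key):
            if key in self.dirty:
                self.store.write(key, self.items[key])  # flush the modified copy
                self.dirty.discard(key)
            del self.items[key]

Write-back reduces traffic to the slower store but means the original copy can be stale, which is exactly why the multiprocessor case mentioned above, where several caches may hold copies of the same data, is complicated.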