Figure 15-3 shows the effect of data caching on a series of random select statements executed over a period of time. If the cache is initially empty, the first select statement is guaranteed to require disk I/O. Be sure to size the data cache adequately for the number of transactions you expect against the database.
As more queries are executed and the cache fills, the probability increases that one or more page requests can be satisfied from the cache, reducing the average response time of the set of retrievals.
Once the cache is filled, there is a fixed probability of finding a desired page in the cache from that point forward.
Figure 15-3: Effects of random selects on the data cache
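This warm-up behavior can be illustrated with a small simulation. The sketch below is hypothetical and not tied to any particular DBMS: it issues uniformly random page reads against a simple LRU-style cache and reports the running hit ratio, which climbs while the cache fills and then levels off (for a uniform workload, at roughly the ratio of cache size to total pages). The page counts and cache size are assumed values chosen only for illustration.

```python
import random
from collections import OrderedDict

TOTAL_PAGES = 10_000   # hypothetical number of distinct data pages
CACHE_SIZE  = 2_000    # hypothetical data cache capacity, in pages
REQUESTS    = 50_000   # number of random page reads to simulate

cache = OrderedDict()  # keys are page ids; insertion order tracks recency (LRU)
hits = 0

for i in range(1, REQUESTS + 1):
    page = random.randrange(TOTAL_PAGES)   # uniformly random "select"
    if page in cache:
        hits += 1
        cache.move_to_end(page)            # logical read: refresh recency
    else:
        cache[page] = True                 # physical read: bring the page in
        if len(cache) > CACHE_SIZE:
            cache.popitem(last=False)      # evict the least recently used page
    if i % 10_000 == 0:
        print(f"after {i:>6} requests: running hit ratio = {hits / i:.2%}")
```

Running the sketch shows the hit ratio rising from zero while the cache is cold and then stabilizing near 20 percent, the fixed probability the text describes for this particular (assumed) ratio of cache size to data size.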
If the cache is smaller than the total number of pages that are being accessed in all databases, there is a chance that a given statement will have to perform some disk I/O. A cache does not reduce the maximum possible response time—some query may still need to perform physical I/O for all of the pages it needs. But caching decreases the likelihood that the maximum delay will be suffered by a particular query—more queries are likely to find at least some of the required pages in cache.
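One way to see the distinction between the average and the maximum response time is to compute the expected cost of a query from the cache hit ratio. The per-page costs and page count below are assumed figures for illustration only, not measurements from any particular system.

```python
LOGICAL_READ_MS  = 0.1   # assumed cost of reading a page already in cache
PHYSICAL_READ_MS = 10.0  # assumed cost of reading a page from disk
PAGES_PER_QUERY  = 20    # hypothetical number of pages one query touches

def expected_query_ms(hit_ratio: float) -> float:
    """Expected response time when each page is a cache hit with probability hit_ratio."""
    per_page = hit_ratio * LOGICAL_READ_MS + (1 - hit_ratio) * PHYSICAL_READ_MS
    return PAGES_PER_QUERY * per_page

for ratio in (0.0, 0.5, 0.9):
    print(f"hit ratio {ratio:.0%}: expected {expected_query_ms(ratio):.1f} ms, "
          f"worst case {PAGES_PER_QUERY * PHYSICAL_READ_MS:.1f} ms")
```

The worst case stays at 200 ms no matter what the hit ratio is, because a query can still miss on every page it touches; only the expected response time falls as the hit ratio rises.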