ALVES, R. G. O.; http://lattes.cnpq.br/6631983070915152; ALVES, Ruan Gomes de Oliveira.
Abstract:
Caching is crucial for improving the performance of Web services.
The key factor that determines the quality of caching is its capacity.
If the cache capacity is too small, caching performance (e.g., hit
ratio) is degraded. However, one cannot simply add more capacity
to improve performance. If cache capacity grows beyond a certain
point, caching performance is not improved, and the added capacity
is wasted. For a given capacity, caching systems apply a replacement
policy that decides which items must be removed when the
cache is full. The literature offers many cache replacement policies. In
this work, we take another direction: we analyze how to use
cache capacity thriftily through admission control policies. Instead
of deciding which items should be removed from the cache, the
admission control decides whether an item should enter the cache.
We considered LARC, a well-known admission policy that admits
an item into the cache only on its second access. We evaluated this
cache policy by simulating a request trace of a large e-commerce
web service. The results indicate that when the cache is oversized,
e.g. its capacity is larger than the optimal, the admission control
policy reduces the cache usage up to 50%, with a small 5% performance
penalty. On the other hand, when the cache is undersized,
the admission control policy increases the cache performance up to
22% with no cost on cache usage increase. We believe these results
indicate that there is potential for integrating admission control
into cache management.
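The second-access admission rule attributed to LARC above can be sketched as follows. This is a simplified illustration, not the evaluated implementation: the class name, the fixed filter size, and plain LRU eviction are our assumptions (the original LARC adapts its filter size dynamically).

```python
from collections import OrderedDict

class LARCCache:
    """Sketch of second-access admission (LARC-style).

    A key's first access only records it in a 'filter' of candidate
    keys; the item is admitted into the real cache on its second
    access. Both structures use LRU eviction. Sizes are illustrative.
    """

    def __init__(self, capacity, filter_capacity=None):
        self.capacity = capacity
        self.filter_capacity = filter_capacity or capacity
        self.cache = OrderedDict()    # admitted items, LRU order
        self.filter = OrderedDict()   # keys seen once (ghost entries)
        self.hits = 0
        self.requests = 0

    def access(self, key):
        """Process one request; return True on a cache hit."""
        self.requests += 1
        if key in self.cache:                    # hit: refresh recency
            self.cache.move_to_end(key)
            self.hits += 1
            return True
        if key in self.filter:                   # second access: admit
            del self.filter[key]
            self.cache[key] = True
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)   # evict LRU item
        else:                                    # first access: remember key
            self.filter[key] = True
            if len(self.filter) > self.filter_capacity:
                self.filter.popitem(last=False)
        return False

    def hit_ratio(self):
        return self.hits / self.requests if self.requests else 0.0
```

For example, replaying a short synthetic trace shows that items enter the cache only after their second request:

```python
c = LARCCache(capacity=2)
trace = ["a", "b", "a", "a", "c", "b", "b"]
results = [c.access(k) for k in trace]
# → [False, False, False, True, False, False, True]
```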