Academic Paper

Fresh Caching of Dynamic Contents using Restless Multi-armed Bandits
Document Type
Working Paper
Source
Subject
Computer Science - Networking and Internet Architecture
Language
English
Abstract
We consider a dynamic content caching framework in which contents are updated at a central server and a subset of the contents is cached at a local cache associated with a base station (BS). When a request arrives, the BS decides whether to fetch the content from the central server or, if the content is cached, to serve the cached version from the local cache. Fetching a content incurs a fixed fetching cost, while serving the cached version incurs an ageing cost proportional to the age-of-version (AoV) of the content. AoV is a freshness metric that counts the number of updates at the central server since the content was last fetched. We aim to minimize the average cost (fetching cost plus ageing cost) subject to the cache capacity constraint. This cost minimization problem is a continuous-time restless multi-armed bandit (RMAB) problem. The single-content problem of the corresponding RMAB is a partially observable Markov decision process (POMDP), since the BS observes the AoV of a cached content only when it fetches the content. We reformulate the POMDP as a semi-Markov decision process and provide a Whittle index based solution to this problem. Finally, we compare the performance with recent work and show via simulations that our proposed policy is optimal.
Comment: 14 pages, 7 figures
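
To make the fetch-versus-ageing trade-off concrete, the sketch below simulates a single content under the cost structure described in the abstract: a fixed fetching cost against an ageing cost proportional to the AoV, with the true AoV hidden from the BS between fetches. It uses a simple belief-threshold rule as a stand-in policy, not the paper's Whittle index policy, and all parameter names and values (fetch_cost, ageing_cost, update_prob, threshold) are illustrative assumptions introduced here.

```python
import random

def simulate(threshold, num_requests=100_000, fetch_cost=5.0,
             ageing_cost=1.0, update_prob=0.3, seed=0):
    """Average per-request cost of a belief-threshold policy for one content.

    Illustrative sketch only; the parameters and the threshold rule are
    assumptions, not the policy derived in the paper.
    """
    rng = random.Random(seed)
    true_aov = 0        # actual number of server updates since the last fetch
    belief_aov = 0.0    # BS-side expected AoV; the true AoV is hidden between fetches
    total_cost = 0.0
    for _ in range(num_requests):
        # the server may update the content between requests (unobserved by the BS)
        if rng.random() < update_prob:
            true_aov += 1
        belief_aov += update_prob  # expected AoV grows by the per-slot update probability
        if belief_aov >= threshold:
            # fetch the fresh version: pay the fixed fetching cost, AoV resets to zero
            total_cost += fetch_cost
            true_aov = 0
            belief_aov = 0.0
        else:
            # serve the cached copy: pay ageing cost proportional to the true AoV
            total_cost += ageing_cost * true_aov
    return total_cost / num_requests

if __name__ == "__main__":
    # sweep the threshold to see the trade-off between frequent fetching
    # (high fetching cost) and stale serving (high ageing cost)
    for thr in (0.5, 1, 2, 3, 5, 8):
        print(f"threshold={thr}: avg cost per request = {simulate(thr):.3f}")
```

Sweeping the threshold shows the basic tension the paper formalizes: a small threshold pays the fetching cost often, a large one accumulates ageing cost, and some intermediate value minimizes the average cost.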