Academic Paper

Bounding Preemption Delay within Data Cache Reference Patterns for Real-Time Tasks
Document Type
Conference
Source
Proceedings of the 12th IEEE Real-Time and Embedded Technology and Applications Symposium (RTAS'06), 2006, pp. 71-80
Subject
Computing and Processing
Timing
Interference
Delay effects
Upper bound
Data analysis
Equations
Real time systems
Computer science
Embedded system
Computer architecture
Language
English
ISSN
1545-3421
Abstract
Caches have become invaluable for higher-end architectures to hide, in part, the increasing gap between processor speed and memory access times. While the effect of caches on the timing predictability of single real-time tasks has been the focus of much research, bounding the overhead of cache warm-ups after preemptions remains a challenging problem, particularly for data caches. In this paper, we bound the penalty of cache interference for real-time tasks by providing accurate predictions of data cache behavior across preemptions. For every task, we derive data cache reference patterns for all scalar and non-scalar references. Partial timing of a task is performed up to a preemption point using these patterns. The effects of cache interference are then analyzed using a set-theoretic approach, which identifies the number and location of additional misses due to preemption. A feedback mechanism provides the means to interact with the timing analyzer, which subsequently times another interval of the task bounded by the next preemption. Our experimental results demonstrate that it is sufficient to consider the n most expensive preemption points, where n is the maximum possible number of preemptions. Further, it is shown that such accurate modeling of data cache behavior in preemptive systems significantly improves the WCET predictions for a task. To the best of our knowledge, our work on bounding preemption delay for data caches is unprecedented.
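The following is a minimal sketch, not taken from the paper, of the set-theoretic idea the abstract describes: at each preemption point, the cache blocks the preempted task may still reuse are intersected with the blocks the preempting task may evict, and the overall bound sums the n most expensive preemption points. All names, types, and the miss penalty constant are illustrative assumptions, not the authors' analysis or API.

```python
from itertools import islice

MISS_PENALTY_CYCLES = 100  # assumed cost of one additional data-cache miss


def preemption_delay(useful_blocks: set[int], evicted_blocks: set[int]) -> int:
    """Delay at one preemption point: useful blocks evicted by the preempting
    task must be re-fetched, each incurring one additional miss."""
    return len(useful_blocks & evicted_blocks) * MISS_PENALTY_CYCLES


def bound_total_delay(points: list[tuple[set[int], set[int]]], n: int) -> int:
    """Bound the total preemption delay by summing the n most expensive
    preemption points, where n is the maximum possible number of preemptions."""
    delays = sorted((preemption_delay(u, e) for u, e in points), reverse=True)
    return sum(islice(delays, n))
```

For example, if a task has three candidate preemption points but can be preempted at most twice (n = 2), only the two points with the largest intersections contribute to the bound.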