Academic Paper

Estimating buffer overflows in three stages using cross-entropy
Document Type
Conference
Source
Proceedings of the Winter Simulation Conference, 2002, vol. 1, pp. 301-309
Subject
Computing and Processing
Computational modeling
Asynchronous transfer mode
Monte Carlo methods
Discrete event simulation
Computer science
Mathematics
Industrial engineering
Engineering management
High performance computing
Switches
Language
English
Abstract
In this paper we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level; finally, the tilting parameter just found is used to estimate the overflow probability of interest. We recognize three distinct properties of the method that together explain why it works well; we conjecture that they hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
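The three-stage scheme described in the abstract can be illustrated with a small sketch. The code below is not the authors' implementation; it applies the idea to a toy model (the embedded random walk of an M/M/1 queue, with "overflow" meaning the buffer reaches a given level within a busy cycle), and the rates, buffer levels, and sample sizes are assumptions chosen only for illustration. The tilting parameter here is the up-step probability of the walk, and the cross-entropy update is the standard closed-form formula for that parameter.

```python
import math
import random


def simulate_cycle(v, p, level):
    """Simulate one busy cycle of the embedded random walk of an M/M/1 queue,
    starting from one customer, drawing up-steps with the tilted probability v
    while the nominal up-step probability is p.
    Returns (overflow indicator, likelihood ratio, #up-steps, #steps)."""
    x, ups, steps, log_w = 1, 0, 0, 0.0
    while 0 < x < level:
        steps += 1
        if random.random() < v:                 # tilted up-step (arrival)
            x += 1
            ups += 1
            log_w += math.log(p / v)
        else:                                   # tilted down-step (departure)
            x -= 1
            log_w += math.log((1.0 - p) / (1.0 - v))
    return (1.0 if x >= level else 0.0), math.exp(log_w), ups, steps


def ce_update(v, p, level, n):
    """One cross-entropy update of the tilting parameter:
    v_new = E_v[W * I * #up-steps] / E_v[W * I * #steps],
    estimated from n cycles simulated under the current tilting v."""
    num = den = 0.0
    for _ in range(n):
        ind, w, ups, steps = simulate_cycle(v, p, level)
        num += w * ind * ups
        den += w * ind * steps
    return num / den if den > 0.0 else v


def three_stage_estimate(lam=1.0, mu=2.0, small_level=5, level=30,
                         n_pilot=2000, n_final=20000, seed=1):
    random.seed(seed)
    p = lam / (lam + mu)                        # nominal up-step probability

    # Stage 1: tilting parameter for a small buffer level, where overflow
    # is not rare, simulated under the nominal parameter.
    v = ce_update(p, p, small_level, n_pilot)

    # Stage 2: use that value as the starting point and update the tilting
    # parameter for the actual (large) buffer level.
    v = ce_update(v, p, level, n_pilot)

    # Stage 3: estimate the overflow probability itself by importance
    # sampling with the tilting parameter just found.
    total = 0.0
    for _ in range(n_final):
        ind, w, _, _ = simulate_cycle(v, p, level)
        total += ind * w
    return total / n_final, v


if __name__ == "__main__":
    est, v = three_stage_estimate()
    # Exact gambler's-ruin value for this toy model, for comparison only.
    r = 2.0 / 1.0                               # mu / lam
    exact = (1.0 - r) / (1.0 - r ** 30)
    print(f"tilting parameter v = {v:.4f}")
    print(f"IS estimate = {est:.3e}, exact = {exact:.3e}")
```

For these illustrative parameters the target probability is on the order of 1e-9, so crude Monte Carlo is hopeless, whereas the pilot stages steer the tilting parameter toward roughly mu/(lam+mu) and the final importance-sampling run recovers the exact gambler's-ruin value with modest sample sizes.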