Academic Paper

Explicit Renyi Entropy for Hidden Markov Models
Document Type
Conference
Source
2020 IEEE International Symposium on Information Theory (ISIT), pp. 2303-2308, Jun. 2020
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Language
ISSN
2157-8117
Abstract
Determining entropy rates of stochastic processes is a fundamental but difficult problem, with closed-form solutions known only for specific cases. This paper advances the state of the art by solving the problem for Hidden Markov Models (HMMs) and Renyi entropies. While the computation of Renyi entropy for Markov chains reduces to studying the growth of a simple matrix product, the computation for HMMs involves products of random matrices. As a result, this case is much harder, and no explicit formulas have been known so far.

In the finite-sample regime we circumvent this issue for Renyi entropy of integer orders, reducing the problem again to single matrix products, where the matrix is built from transition and emission probabilities by means of tensor products. To obtain results in the asymptotic setting, we use a novel technique for determining the growth of non-negative matrix powers. The classical approach, Frobenius-Perron theory, requires positivity assumptions; we instead work directly with the spectral formula. As a consequence, our results do not suffer from limitations such as irreducibility and aperiodicity. This improves our understanding of the entropy rate even for standard (unhidden) chains.

A recently published side-channel attack against RSA was proven effective using our results.
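As a hedged numerical illustration of the two reductions the abstract describes (this is not the paper's code; the function names and toy matrices are invented here): for Renyi order 2, the Markov-chain entropy rate is governed by the spectral radius of the entrywise-squared transition matrix, and for an HMM the sum of squared string probabilities factorizes into powers of a single matrix on the doubled state space, built from transition and emission probabilities via a tensor (Kronecker) product. The growth rate is read off the spectral radius directly, with no irreducibility or aperiodicity assumption needed.

```python
import numpy as np

def renyi2_rate_markov(P):
    # Renyi entropy rate of order 2 for a Markov chain with transition
    # matrix P: H_2 = log(rho(P .^ 2)) / (1 - 2), where P .^ 2 is the
    # entrywise square and rho(.) the spectral radius.
    rho = max(abs(np.linalg.eigvals(P ** 2)))
    return np.log(rho) / (1 - 2)

def renyi2_rate_hmm(A, B):
    # Renyi entropy rate of order 2 for an HMM with transition matrix A
    # (n x n) and emission matrix B (n x m). Summing P(x)^2 over output
    # strings x factorizes into powers of one n^2 x n^2 matrix M on the
    # doubled state space:
    #   M[(i1,i2),(j1,j2)] = A[i1,j1] * A[i2,j2] * sum_o B[j1,o]*B[j2,o].
    C = B @ B.T                           # C[j1,j2] = sum_o B[j1,o]*B[j2,o]
    M = np.kron(A, A) * C.reshape(1, -1)  # scale column (j1,j2) by C[j1,j2]
    rho = max(abs(np.linalg.eigvals(M)))
    return np.log(rho) / (1 - 2)
```

As a sanity check, when the emission matrix is the identity (each output reveals the hidden state), the HMM reduces to a plain Markov chain and the two functions agree. Note also that `rho` is computed from the eigenvalues themselves, so the formulas apply unchanged to periodic or reducible chains, consistent with the abstract's point about avoiding Frobenius-Perron positivity assumptions.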