Academic Paper
Some Generalized Information and Divergence Generating Functions: Properties, Estimation, Validation and Applications
Document Type
Working Paper
Abstract
We propose the R\'enyi information generating function and discuss its properties. A connection between the R\'enyi information generating function and the diversity index is proposed for discrete-type random variables. The relation between the R\'enyi information generating function and the Shannon entropy of order $q>0$ is established, and several bounds are obtained. The R\'enyi information generating function of the escort distribution is derived. Furthermore, we introduce the R\'enyi divergence information generating function and discuss its behaviour under monotone transformations. We present non-parametric and parametric estimators of the R\'enyi information generating function. A simulation study is carried out, and a real data set relating to the failure times of electronic components is analyzed. The non-parametric and parametric estimators are compared in terms of standard deviation, absolute bias, and mean squared error, and superior performance is observed for the newly proposed estimators. Some applications of the proposed R\'enyi information generating function and R\'enyi divergence information generating function are provided. For three coherent systems, we compute the values of the R\'enyi information generating function and of other well-established uncertainty measures, and the R\'enyi information generating function is observed to behave similarly. Further, we study the usefulness of the R\'enyi divergence information generating function and the R\'enyi information generating function as model selection criteria. Finally, three chaotic maps are considered and used to validate the proposed information generating function.
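The abstract does not reproduce the defining formula, so the minimal sketch below works instead with the classical information generating function of a density $f$, $I_f(\beta)=\int f^{\beta}(x)\,dx=E\!\left[f(X)^{\beta-1}\right]$, which the R\'enyi version is understood to generalize. Under that assumption it illustrates the plug-in idea behind a non-parametric estimator: replace the unknown density by a kernel density estimate and average over the sample. The function names igf_plugin and igf_exponential are illustrative only and do not appear in the paper; this is not the estimator proposed by the authors.

import numpy as np
from scipy.stats import gaussian_kde, expon

def igf_plugin(sample, beta):
    # Plug-in estimate of I(beta) = E[f(X)^(beta-1)]: replace the unknown
    # density f by a Gaussian kernel density estimate evaluated at the data.
    kde = gaussian_kde(sample)
    fhat = kde(sample)
    return np.mean(fhat ** (beta - 1.0))

def igf_exponential(rate, beta):
    # Closed form for an Exponential(rate) density: the integral of
    # (rate * exp(-rate * x))^beta over x > 0 equals rate^(beta-1) / beta.
    return rate ** (beta - 1.0) / beta

x = expon(scale=1.0).rvs(size=2000, random_state=0)
for beta in (0.5, 1.5, 2.0):
    print(beta, round(igf_plugin(x, beta), 3), round(igf_exponential(1.0, beta), 3))

The plug-in estimate is closest to the closed form for $\beta$ near 1; for the exponential example the main source of error is the kernel estimate's boundary bias near zero.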