Journal Article

Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification
Document Type
Periodical
Source
IEEE Transactions on Neural Networks and Learning Systems, 35(5):6948-6962, May 2024
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Bayes methods
Mixture models
Computational modeling
Stochastic processes
Estimation
Hidden Markov models
Analytical models
Bayesian estimation
extended stochastic variational inference (SVI)
misuse intrusion detection
mixture models
network traffic classification
text categorization
Language
English
ISSN
2162-237X
2162-2388
Abstract
The finite inverted beta mixture model (IBMM) has proven effective for modeling positive vectors. Under the traditional variational inference framework, the central difficulty in Bayesian estimation of the IBMM is that the computational cost of inference on large datasets is prohibitive, which often restricts Bayesian approaches to small datasets. The recently proposed stochastic variational inference (SVI) framework offers an efficient alternative that scales inference to large datasets. Nevertheless, when the SVI framework is applied to non-Gaussian statistical models, the evidence lower bound (ELBO) cannot be calculated explicitly because the required moment computations are intractable. Consequently, an algorithm under the SVI framework cannot directly optimize the ELBO by stochastic optimization, and no analytically tractable solution can be derived. To address this problem, we propose a more flexible extension of the SVI framework, namely, the extended SVI (ESVI) framework, which is applicable to many non-Gaussian statistical models. First, approximation strategies are applied to further lower-bound the ELBO and thereby avoid the intractable moment calculations. Then, stochastic optimization with noisy natural gradients is used to optimize this lower bound. Evaluations on real data verify the effectiveness and excellent performance of the proposed method.
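The abstract builds on the standard SVI recipe (Hoffman et al., 2013): compute local updates on a random minibatch, form a noisy natural-gradient estimate of the global variational parameters from it, and blend that estimate in with a decaying Robbins-Monro step size. The Python sketch below illustrates only that generic recipe, applied to the conjugate part of a one-dimensional inverted beta mixture, i.e., the Dirichlet posterior over mixing weights, with the component shape parameters held fixed. It is a minimal sketch with illustrative names, not the authors' ESVI algorithm: the non-conjugate shape-parameter updates, where the paper's further ELBO lower bound is needed, are deliberately not reproduced here.

import numpy as np
from scipy.special import betaln, digamma

def log_inverted_beta(x, a, b):
    # Log density of the inverted beta (beta prime) distribution, x > 0:
    # f(x | a, b) = x^(a-1) (1+x)^(-(a+b)) / B(a, b)
    return (a - 1.0) * np.log(x) - (a + b) * np.log1p(x) - betaln(a, b)

def svi_fit_weights(X, a, b, alpha0=1.0, n_epochs=20, batch=64, tau=1.0, kappa=0.7):
    # SVI for the Dirichlet posterior q(pi) = Dir(gamma) over mixing weights.
    # Component shapes (a_k, b_k) are fixed; updating them would require the
    # paper's ESVI bound, since their variational moments are intractable.
    N, K = len(X), len(a)
    gamma = np.full(K, alpha0 + N / K)      # variational Dirichlet parameters
    rng = np.random.default_rng(0)
    t = 0
    for _ in range(n_epochs):
        idx = rng.permutation(N)            # SVI assumes random minibatches
        for start in range(0, N, batch):
            xb = X[idx[start:start + batch]]
            # Local step: responsibilities under the current q(pi).
            log_pi = digamma(gamma) - digamma(gamma.sum())
            log_r = log_pi[None, :] + np.stack(
                [log_inverted_beta(xb, a[k], b[k]) for k in range(K)], axis=1)
            r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
            r /= r.sum(axis=1, keepdims=True)
            # Global step: intermediate estimate from the minibatch, scaled
            # to the full dataset, acts as a noisy natural gradient target.
            gamma_hat = alpha0 + (N / len(xb)) * r.sum(axis=0)
            rho = (t + tau) ** (-kappa)     # Robbins-Monro step size
            gamma = (1.0 - rho) * gamma + rho * gamma_hat
            t += 1
    return gamma

# Toy usage: two inverted-beta components with known shapes.
# If u ~ Beta(a, b), then u / (1 - u) follows an inverted beta (a, b).
rng = np.random.default_rng(1)
u = np.concatenate([rng.beta(2.0, 5.0, 500), rng.beta(8.0, 2.0, 500)])
X = u / (1.0 - u)
print(svi_fit_weights(X, a=np.array([2.0, 8.0]), b=np.array([5.0, 2.0])))

Because the Dirichlet is conjugate to the component indicators, the noisy natural-gradient update reduces to the simple convex blend of the old parameters and the minibatch estimate shown above; the paper's contribution is making an analogous update possible for the non-conjugate inverted-beta shape parameters.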