Academic Paper

The Application of Bayesian Penalty Regression in Sparse Regularization
Document Type
Conference
Author
Source
2020 International Conference on Big Data & Artificial Intelligence & Software Engineering (ICBASE), pp. 158-165, Oct. 2020
Subject
Computing and Processing
Buildings
Predictive models
Data models
Bayes methods
Software engineering
regularization
shrinkage priors
Gibbs sampler
hierarchical models
convex penalty regression
non-convex penalty regression
Language
Abstract
Regression models built on high-dimensional sparse data are prone to overfitting, and regularization is a classic and effective remedy; well-known examples include ridge regression, Lasso regression, and the elastic net. In the Bayesian framework, the penalty term is derived from a specific shrinkage prior, and posterior simulation relies on a hierarchical model and the Gibbs sampler. In this paper, we build a regression model on the concrete slump test dataset and apply Bayesian convex penalty regression and non-convex penalty regression to screen the variables in the model. The results are compared with those of the classical regularization methods by evaluating the fitted models on their predictions, and finally we explain the advantages of the Bayesian models over ordinary regularization.
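
The abstract states that in the Bayesian framework the penalty term arises from a shrinkage prior and that simulation uses a hierarchical model with a Gibbs sampler; for example, the Lasso penalty corresponds to independent Laplace priors on the coefficients, whose posterior mode is the Lasso estimate. The Python sketch below illustrates this general technique with the standard Bayesian Lasso Gibbs sampler (Park and Casella, 2008), writing the Laplace prior as a scale mixture of normals. It is a minimal illustration under assumed settings, not the paper's implementation; the function name bayesian_lasso_gibbs, the fixed value of lam, and the toy data are all illustrative.

import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    # Minimal Bayesian Lasso Gibbs sampler; lam is a fixed, illustrative penalty level.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    sigma2 = 1.0
    inv_tau2 = np.ones(p)              # 1 / tau_j^2, the local shrinkage scales
    XtX, Xty = X.T @ X, X.T @ y
    draws = np.zeros((n_iter, p))

    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), with A = X'X + diag(1/tau_j^2)
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        cov = sigma2 * A_inv
        beta = rng.multivariate_normal(A_inv @ Xty, (cov + cov.T) / 2.0)

        # sigma2 | rest ~ Inverse-Gamma((n-1+p)/2, (||y - X beta||^2 + beta' D^{-1} beta)/2)
        resid = y - X @ beta
        shape = (n - 1 + p) / 2.0
        rate = (resid @ resid + beta @ (inv_tau2 * beta)) / 2.0
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)

        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)

        draws[it] = beta
    return draws

# Toy usage (assumed data, not the concrete slump test dataset):
# a sparse ground truth recovered from the posterior means after burn-in.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ beta_true + 0.5 * rng.standard_normal(100)
samples = bayesian_lasso_gibbs(X, y, lam=1.0)
print(np.round(samples[500:].mean(axis=0), 2))

The posterior draws also provide interval estimates for each coefficient, which is one of the practical advantages of the Bayesian formulation over a point-estimate regularization fit.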