Academic Article

Recursive Neural Networks for Density Estimation Over Generalized Random Graphs
Document Type
Periodical
Source
IEEE Transactions on Neural Networks and Learning Systems, 29(11):5441-5458, Nov. 2018
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
General Topics for Engineers
Keywords
Encoding
Estimation
Probabilistic logic
Probability density function
Neural networks
Clustering algorithms
Data models
Density estimation
graph neural network (GNN)
random graph (RG)
recursive neural network (RNN)
structured data
Language
English
ISSN
2162-237X (print)
2162-2388 (online)
Abstract
Structured data in the form of labeled graphs (with variable order and topology) may be thought of as outcomes of a random graph (RG) generating process characterized by an underlying probabilistic law. This paper formalizes the notions of generalized RG (GRG) and probability density function (pdf) for GRGs. A “universal” learning machine, combining the encoding module of a recursive neural network with a radial basis function network, is then introduced for estimating the unknown pdf from an unsupervised sample of GRGs. A maximum likelihood training algorithm is presented and constrained so that the resulting model satisfies the axioms of probability. Techniques for preventing the model from converging to degenerate solutions are proposed, along with variants of the algorithm suited to graph classification and graph clustering. The major properties of the machine are discussed. The approach is validated empirically through experiments on pdf estimation for synthetic and real-life GRGs, classification of images from the Caltech Benchmark data set and molecules from the Mutagenesis data set, and clustering of images from the LabelMe data set.
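
The abstract describes a two-stage architecture: a recursive encoder maps each labeled graph to a fixed-size vector, and a radial basis function (RBF) network over that vector yields a density value, with the whole model trained by constrained maximum likelihood. The sketch below (in PyTorch) only illustrates that general idea and is not the authors' algorithm: the encoder update, the classes GraphEncoder and RBFDensity, the function train_step, and all dimensions and constants are hypothetical, and the softmax/positivity parameterization merely stands in for the paper's probability-axiom constraints.

import math
import torch
import torch.nn as nn

class GraphEncoder(nn.Module):
    # Simplified recursive/message-passing encoder: node states are updated
    # from node labels and aggregated neighbor states for a fixed number of
    # iterations, then summed into one graph-level embedding.
    def __init__(self, label_dim, state_dim, iters=3):
        super().__init__()
        self.iters = iters
        self.msg = nn.Linear(label_dim + state_dim, state_dim)
        self.out = nn.Linear(state_dim, state_dim)

    def forward(self, labels, adj):
        # labels: (n, label_dim) node-label matrix; adj: (n, n) float adjacency
        state = torch.zeros(labels.size(0), self.msg.out_features)
        for _ in range(self.iters):
            agg = adj @ state  # aggregate neighbor states
            state = torch.tanh(self.msg(torch.cat([labels, agg], dim=-1)))
        return self.out(state.sum(dim=0))  # fixed-size graph embedding

class RBFDensity(nn.Module):
    # Gaussian RBF mixture over the embedding; softmax mixing weights and a
    # positive variance parameterization keep the output a valid density in
    # the embedding space during gradient-based training.
    def __init__(self, state_dim, n_centers=16):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(n_centers, state_dim))
        self.log_var = nn.Parameter(torch.zeros(n_centers))
        self.logit_w = nn.Parameter(torch.zeros(n_centers))

    def forward(self, z):
        w = torch.softmax(self.logit_w, dim=0)      # nonnegative, sums to 1
        var = torch.exp(self.log_var) + 1e-6        # strictly positive variances
        d2 = ((z - self.centers) ** 2).sum(dim=-1)  # squared distances to centers
        norm = (2 * math.pi * var) ** (z.size(-1) / 2)
        return (w * torch.exp(-0.5 * d2 / var) / norm).sum()

def train_step(encoder, density, graphs, optimizer):
    # One maximum-likelihood step: minimize the average negative log-likelihood
    # of an unsupervised sample of graphs, given as (labels, adj) pairs.
    optimizer.zero_grad()
    nll = sum(-torch.log(density(encoder(labels, adj)) + 1e-12)
              for labels, adj in graphs) / len(graphs)
    nll.backward()
    optimizer.step()
    return nll.item()

Constraining the mixing weights to the probability simplex (via softmax) and the variances to be positive is one simple way to keep the estimate a proper mixture density throughout training; the paper's constrained training procedure and its handling of degenerate solutions are more involved.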