VARIATIONAL BAYESIAN TENSOR NETWORKS WITH STRUCTURED POSTERIORS
Kriton Konstantinidis, Yao Xu, Danilo Mandic, Qibin Zhao
SPS
Tensor network (TN) methods have proven their considerable potential in deterministic regression and classification paradigms, but remain underexplored in probabilistic settings. To this end, we introduce a variational inference framework for supervised learning with TNs, referred to as the Bayesian Tensor Network (BTN). This is achieved by exploiting the multi-linear nature of tensor networks, which allows us to construct a structured variational model that scales linearly with data dimensionality. The imposed low-rank structure on the tensor mean, together with the Kronecker separability of the local covariances, makes it possible to efficiently induce weight dependencies in the posterior distribution. This is shown to enhance model expressiveness at a drastically lower parameter complexity than the standard mean-field approach. A comprehensive validation of the proposed framework demonstrates the competitiveness of BTNs against existing structured Bayesian neural network approaches, while exhibiting enhanced interpretability, computational efficiency, and the ability to yield credibility intervals.
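To give a concrete sense of the parameter savings that Kronecker-separable covariances afford, the sketch below (a minimal NumPy illustration, not the paper's implementation) compares a full covariance over a vectorized m-by-n weight core with one factored into two small matrices, and shows how samples can be drawn directly from the factors:

```python
import numpy as np

# Hedged sketch: a Kronecker-separable covariance for an m x n weight core.
# A full covariance over vec(W) needs (m*n)^2 parameters; the factored form
# Sigma = B kron A needs only m^2 + n^2 (factor order follows the
# column-major vec convention used below).
rng = np.random.default_rng(0)
m, n = 4, 6

def spd(k):
    # Random symmetric positive-definite factor.
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

A, B = spd(m), spd(n)
Sigma = np.kron(B, A)             # full covariance, reconstructed on demand

full_params = (m * n) ** 2        # unstructured covariance parameter count
kron_params = m ** 2 + n ** 2     # Kronecker-factored parameter count

# Sampling W with Cov(vec(W)) = B kron A never forms Sigma explicitly:
# W = L_A @ Z @ L_B.T for Z ~ N(0, I), using the Cholesky factors.
L_A, L_B = np.linalg.cholesky(A), np.linalg.cholesky(B)
Z = rng.standard_normal((m, n))
W = L_A @ Z @ L_B.T               # costs O(mn(m+n)) instead of O((mn)^2)

print(full_params, kron_params)   # 576 vs 52 parameters
```

Even at this toy size the factored form needs roughly an order of magnitude fewer parameters, and the gap widens rapidly with dimensionality, which is what makes structured posteriors of this kind tractable where a dense covariance would not be.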