VARIATIONAL BAYESIAN PARTIALLY OBSERVED NON-NEGATIVE TENSOR FACTORIZATION

Bibliographic Details
Published in: 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), pp. 1 - 6
Main Authors: Hinrich, Jesper L., Nielsen, Søren F. V., Madsen, Kristoffer H., Mørup, Morten
Format: Conference Proceeding
Language: English
Published: IEEE, 01-09-2018
Description
Summary: Non-negative matrix and tensor factorization (NMF/NTF) have become important tools for extracting part-based representations from data. It is, however, unclear when an NMF or NTF approach is best suited to the data and how reliably the models predict when trained on partially observed data. We presently extend a recently proposed variational Bayesian NMF (VB-NMF) to non-negative tensor factorization (VB-NTF) for partially observed data. This admits bi- and multi-linear structure quantification considering both model prediction and evidence. We evaluate the developed VB-NTF on synthetic data and a real dataset of gene expression in the human brain, and contrast the performance with VB-NMF and conventional NMF/NTF. We find that the gene expressions are better accounted for by VB-NMF than VB-NTF, and that VB-NMF/VB-NTF handle partially observed data more robustly than conventional NMF/NTF. In particular, probabilistic modeling is beneficial when large amounts of data are missing and/or the model order is over-specified.
DOI: 10.1109/MLSP.2018.8516924
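
For readers unfamiliar with the setting, the sketch below illustrates a conventional masked non-negative tensor (CP/PARAFAC) factorization fitted only on observed entries, i.e. the kind of non-Bayesian baseline the paper compares against, using standard multiplicative updates in NumPy. This is not the authors' variational Bayesian model; the function names, update rules, and parameters here are illustrative assumptions only.

import numpy as np

def masked_ntf(X, M, rank, n_iter=500, eps=1e-9, seed=0):
    """Least-squares non-negative CP fit of X (I x J x K) on entries where M == 1."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    Xm = X * M                                   # missing cells contribute zero
    for _ in range(n_iter):
        # Masked multiplicative updates: each factor is rescaled by the ratio of
        # observed data to the current reconstruction on the observed cells.
        Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
        A *= np.einsum('ijk,jr,kr->ir', Xm, B, C) / \
             (np.einsum('ijk,jr,kr->ir', M * Xhat, B, C) + eps)
        Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
        B *= np.einsum('ijk,ir,kr->jr', Xm, A, C) / \
             (np.einsum('ijk,ir,kr->jr', M * Xhat, A, C) + eps)
        Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
        C *= np.einsum('ijk,ir,jr->kr', Xm, A, B) / \
             (np.einsum('ijk,ir,jr->kr', M * Xhat, A, B) + eps)
    return A, B, C

# Example: recover a rank-3 non-negative tensor with 40% of the entries held out.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((30, 3)), rng.random((20, 3)), rng.random((10, 3))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
M = (rng.random(X.shape) > 0.4).astype(float)    # 1 = observed, 0 = missing
A, B, C = masked_ntf(X, M, rank=3)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("RMSE on held-out entries:", np.sqrt(((X - Xhat)[M == 0] ** 2).mean()))

As the abstract notes, such point-estimate factorizations can degrade when many entries are missing or the rank is over-specified, which is the motivation for the paper's variational Bayesian treatment.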