Scalable Symmetric Tucker Tensor Decomposition

Abstract

We study the best low-rank Tucker decomposition of symmetric tensors, advocating a straightforward projected gradient descent (PGD) method for its computation. The main application of interest is decomposing higher-order multivariate moments, which are symmetric tensors. We develop scalable adaptations of the basic PGD method and of the higher-order eigenvalue decomposition (HOEVD) for decomposing sample moment tensors. With the help of implicit and streaming techniques, we avoid the overhead of explicitly building and storing the moment tensor. These reductions make computing the Tucker decomposition feasible for large data instances in high dimensions. Numerical experiments demonstrate the efficiency of the algorithms and the applicability of moment tensor decompositions to real-world datasets. Lastly, we study convergence on the Grassmannian manifold and prove that the update sequence produced by the PGD solver achieves first- and second-order criticality.
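To illustrate the implicit technique in the simplest case: for third-order moments M = (1/n) sum_i x_i (x) x_i (x) x_i, the compressed tensor M x1 U^T x2 U^T x3 U^T equals the empirical third moment of the projected samples U^T x_i, so the d x d x d moment tensor never needs to be materialized. The NumPy sketch below is not the paper's code; the function names, the fixed normalized step size, and the polar retraction are illustrative assumptions. It performs projected gradient ascent on f(U) = ||M x1 U^T x2 U^T x3 U^T||_F^2 using only the n x r matrix of projected samples.

import numpy as np

def implicit_core(X, U):
    """Core C = M x1 U^T x2 U^T x3 U^T for M = (1/n) sum_i x_i^{(x)3},
    built from projected samples only (O(n r + r^3) memory, not O(d^3))."""
    Y = X @ U                                    # n x r projected samples
    return np.einsum('si,sj,sk->ijk', Y, Y, Y) / X.shape[0]

def pgd_step(X, U, eta=0.05):
    """One projected gradient ascent step on f(U) = ||C(U)||_F^2, retracting
    to the Stiefel manifold via the polar decomposition (an assumption here;
    the step-size rule is also illustrative)."""
    n = X.shape[0]
    Y = X @ U
    C = np.einsum('si,sj,sk->ijk', Y, Y, Y) / n
    W = np.einsum('bjk,sj,sk->sb', C, Y, Y)      # contract C with each sample
    G = 6.0 * (X.T @ W) / n                      # Euclidean gradient of f at U
    A = U + eta * G / np.linalg.norm(G)          # normalized ascent step
    P, _, Qt = np.linalg.svd(A, full_matrices=False)
    return P @ Qt                                # nearest orthonormal matrix

# Toy usage: d = 50, r = 5, with planted skewed structure so the third
# moment is nonzero.
rng = np.random.default_rng(0)
Z = rng.exponential(1.0, size=(1000, 5))         # skewed latent factors
X = Z @ rng.standard_normal((5, 50))             # n x d samples
U = np.linalg.qr(rng.standard_normal((50, 5)))[0]
for _ in range(50):
    U = pgd_step(X, U)
print(np.linalg.norm(implicit_core(X, U)))       # sqrt of f(U); should grow

The streaming variant follows the same identity: each sample contributes its projection U^T x_i to the core and gradient, so data can be processed in batches without ever holding M.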

Publication
arXiv

Date
April 2022

Citation
R. Jin, J. Kileel, T. G. Kolda, R. Ward. Scalable Symmetric Tucker Tensor Decomposition. arXiv:2204.10824, 2022. http://arxiv.org/abs/2204.10824

BibTeX

@misc{JiKiKoWa22,
  author        = {Ruhui Jin and Joe Kileel and Tamara G. Kolda and Rachel Ward},
  title         = {Scalable Symmetric {Tucker} Tensor Decomposition},
  month         = {April},
  year          = {2022},
  eprint        = {2204.10824},
  archiveprefix = {arXiv},
  url           = {http://arxiv.org/abs/2204.10824},
}