We consider the problem of decomposing higher-order moment tensors, i.e., sums of symmetric outer products of data vectors. Such a decomposition can be used to estimate the means in a Gaussian mixture model and for other applications in machine learning. The $d$th-order empirical moment tensor of a set of $p$ observations of $n$ variables is a symmetric $d$-way tensor. Our goal is to find a low-rank tensor approximation comprising $r \ll p$ symmetric outer products. The challenge is that forming the empirical moment tensor costs $O(pn^d)$ operations and $O(n^d)$ storage, which may be prohibitively expensive; moreover, the standard algorithm for computing the low-rank approximation costs $O(n^d)$ per iteration. Our contribution avoids forming the moment tensor altogether, computing the low-rank approximation implicitly in $O(pnr)$ operations per iteration and with no extra memory. This advance opens the door to more applications of higher-order moments, since they can now be used efficiently. We present numerical evidence of the computational savings and show an example of estimating the means of a Gaussian mixture model via higher-order moments.
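The core savings can be illustrated with a small sketch (not the paper's algorithm itself): for $d = 3$, contracting the empirical moment tensor against a vector $v$ in all modes satisfies $M(v,v,v) = \frac{1}{p}\sum_j \langle x_j, v\rangle^3$, so it can be evaluated from the data in $O(pn)$ operations without ever materializing the $n^3$ tensor. The function names below are illustrative, not from the paper.

```python
import numpy as np

def implicit_moment_contraction(X, v):
    """Evaluate M(v, v, v) for the 3rd-order empirical moment tensor
    M = (1/p) * sum_j x_j o x_j o x_j without forming the n^3 tensor.
    X is p-by-n (rows are observations), v has length n. Cost: O(p n)."""
    s = X @ v             # inner products <x_j, v> for all j, O(p n)
    return np.mean(s**3)  # average of cubed inner products

def explicit_moment_contraction(X, v):
    """Reference version: form the full moment tensor explicitly.
    Costs O(p n^3) operations and O(n^3) memory."""
    p, n = X.shape
    M = np.zeros((n, n, n))
    for x in X:
        M += np.einsum('i,j,k->ijk', x, x, x)  # symmetric outer product
    M /= p
    return np.einsum('ijk,i,j,k->', M, v, v, v)  # contract all three modes
```

Both functions return the same value up to floating-point roundoff; only their cost differs, which is the gap the implicit approach exploits.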

Type

Publication

Date

Sep 2020

Tags

higher-order moments, higher-order cumulants, Gaussian mixture models, symmetric tensor decomposition, implicit tensor formation

Citation

S. Sherman, T. G. Kolda.
**Estimating Higher-Order Moments Using Symmetric Tensor Decomposition**.
*SIAM Journal on Matrix Analysis and Applications*, Vol. 41, No. 3, pp. 1369–1387, 2020.
https://doi.org/10.1137/19m1299633

```
@article{ShKo20,
author = {Samantha Sherman and Tamara G. Kolda},
title = {Estimating Higher-Order Moments Using Symmetric Tensor Decomposition},
journal = {SIAM Journal on Matrix Analysis and Applications},
volume = {41},
number = {3},
pages = {1369--1387},
pagetotal = {19},
month = {September},
year = {2020},
doi = {10.1137/19m1299633},
eprint = {1911.03813},
}
```