The tensor-tensor product (t-product) [1] is a natural generalization of matrix multiplication. Based on the t-product, many matrix operations can be extended to the tensor case, including the tensor SVD (see an illustration in the figure below), tensor spectral norm, tensor nuclear norm [2], and many others.
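For intuition, the t-product of A (n1 x n2 x n3) and B (n2 x m x n3) can be computed by taking the FFT along the third dimension, multiplying the frontal slices pairwise, and transforming back. Below is a minimal base-MATLAB sketch of this computation; the sizes are illustrative, and the toolbox's own `tprod` routine is the reference implementation:

```matlab
% Minimal sketch: t-product C = A * B via the FFT along the third dimension.
n1 = 4; n2 = 3; m = 5; n3 = 6;            % illustrative sizes
A = randn(n1,n2,n3);  B = randn(n2,m,n3);
Af = fft(A,[],3);  Bf = fft(B,[],3);      % transform to the Fourier domain
Cf = zeros(n1,m,n3);
for k = 1:n3
    Cf(:,:,k) = Af(:,:,k) * Bf(:,:,k);    % ordinary matrix products, slice by slice
end
C = real(ifft(Cf,[],3));                  % back to the original domain; C is n1 x m x n3
```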
The linear algebraic structure of tensors is similar to the matrix case: we have the tensor-tensor product, tensor SVD, tensor inverse, and other related concepts extended from matrices. The detailed definitions of these tensor concepts, operations, and tensor factorizations are given at https://canyilu.github.io/publications/2018-software-tproduct.pdf. We developed a Matlab toolbox that implements several basic operations on tensors based on the t-product. See the list of functions implemented in t-product toolbox 1.0 below.
The original t-product [1] uses the discrete Fourier transform, computed efficiently via the fast Fourier transform (FFT). It was further generalized to the t-product under an arbitrary invertible linear transform in [2]. Thus, all the concepts of the t-product under the FFT (e.g., tensor SVD, tensor inverse) carry over to the t-product under general linear transforms. If the transform matrix L ∈ R^{n3×n3} satisfies L^T L = L L^T = ℓI for some constant ℓ > 0, then we can define a more general tensor nuclear norm induced by the t-product under this transform. We therefore developed a more general Matlab toolbox that implements the t-product under general linear transforms. See the list of functions implemented in t-product toolbox 2.0 below.
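To make the generalization concrete, here is a minimal base-MATLAB sketch of the t-product under an orthogonal transform L (so L^T L = L L^T = I, i.e., ℓ = 1 in the condition above). The toolbox's own interface for selecting the transform may differ; the helpers `tubes`, `fwd`, and `bwd` below are illustrative constructions, not toolbox functions:

```matlab
% Sketch: t-product under an orthogonal transform L, with L'*L = L*L' = I.
n1 = 4; n2 = 3; m = 5; n3 = 6;                 % illustrative sizes
A = randn(n1,n2,n3);  B = randn(n2,m,n3);
[L,~] = qr(randn(n3));                         % a random orthogonal n3 x n3 transform
tubes = @(X) reshape(X, [], size(X,3));        % rows are the tubes X(i,j,:)
fwd = @(X) reshape(tubes(X) * L.', size(X));   % apply L along the 3rd dimension
bwd = @(X) reshape(tubes(X) * L,   size(X));   % apply inv(L) = L' along the 3rd dimension
Ah = fwd(A);  Bh = fwd(B);
Ch = zeros(n1,m,n3);
for k = 1:n3
    Ch(:,:,k) = Ah(:,:,k) * Bh(:,:,k);         % slice-wise products in the transform domain
end
C = bwd(Ch);                                   % C = A * B under the transform L
```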
For the definitions of the t-product and related concepts under general linear transforms, please refer to [2] and our works [6,7]. We will provide a document with the details in the future.
Simply run the following routine to test all the above functions:
test.m
When citing this toolbox in your papers, please use the following references:
- C. Lu. Tensor-Tensor Product Toolbox. Carnegie Mellon University, June 2018. https://github.com/canyilu/tproduct.
- C. Lu, J. Feng, Y. Chen, W. Liu, Z. Lin, and S. Yan. Tensor robust principal component analysis with a new tensor nuclear norm. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019.
- C. Lu, X. Peng, and Y. Wei. Low-rank tensor completion with a new tensor nuclear norm induced by invertible linear transforms. IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), 2019.
The corresponding BiBTeX citations are given below:
@manual{lu2018tproduct,
  author       = {Lu, Canyi},
  title        = {Tensor-Tensor Product Toolbox},
  organization = {Carnegie Mellon University},
  month        = {June},
  year         = {2018},
  note         = {\url{https://github.com/canyilu/tproduct}}
}

@article{lu2018tensor,
  author  = {Lu, Canyi and Feng, Jiashi and Chen, Yudong and Liu, Wei and Lin, Zhouchen and Yan, Shuicheng},
  title   = {Tensor Robust Principal Component Analysis with A New Tensor Nuclear Norm},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year    = {2019}
}

@inproceedings{lu2019tensor,
  author    = {Lu, Canyi and Peng, Xi and Wei, Yunchao},
  title     = {Low-Rank Tensor Completion With a New Tensor Nuclear Norm Induced by Invertible Linear Transforms},
  booktitle = {CVPR},
  year      = {2019}
}
- Version 1.0 was released in June 2018. It implements the functions of the t-product and related concepts under the fast Fourier transform.
- Version 2.0 was released in April 2021. It implements the functions of the t-product and related concepts under general invertible linear transforms. The fast Fourier transform is the default transform.
- Most functions are direct generalizations from the fast Fourier transform to general linear transforms, e.g., `tprod`, `tran`, `teye`, `tinv`, `tsvd`, `tubalrank`, `tsn`, `tnn`, `prox_tnn`, and `tqr` (see the usage sketch after this list).
- Some functions are new (not included in Version 1.0), e.g., `basis_column`, `basis_tube`, and `unit_eijk`.
- Some functions in Version 1.0 are updated, e.g., the setting of the parameter `tol` in `tubalrank` and `tsvd` is updated, and `tprod`, `tsn`, `tinv`, and `tqr` are updated.
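A hypothetical usage sketch is given below. The call signatures `C = tprod(A,B)`, `[U,S,V] = tsvd(A)`, and `r = tubalrank(A)` are assumptions for illustration and may differ from the shipped code; see the linked definitions document for the authoritative interfaces.

```matlab
% Hypothetical usage of the listed functions; signatures assumed, not verified.
A = randn(30,40,20);  B = randn(40,25,20);
C = tprod(A,B);        % t-product C = A * B (FFT is the default transform)
[U,S,V] = tsvd(A);     % tensor SVD of A under the t-product
r = tubalrank(A);      % tensor tubal rank of A
```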
The t-product toolbox has been applied in our works on tensor robust PCA [3,4], low-rank tensor completion, and low-rank tensor recovery from Gaussian measurements [5]. The t-product under general linear transforms has also been applied to tensor completion [6] and tensor robust PCA [7]. Some more models are included in the LibADMM toolbox [8]. The tensor robust PCA model from [3,4] is recalled after the list below.
- Tensor robust principal component analysis
- Low tubal rank tensor completion and tensor recovery from Gaussian measurements
- Tensor robust PCA and tensor completion based on tensor nuclear norm under linear transform
- A Library of ADMM for Sparse and Low-rank Optimization
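For reference, the tensor robust PCA model studied in [3,4] is the convex program below, where X is the observed tensor, the nuclear norm term uses the tensor nuclear norm, the ℓ1 term is entrywise, and the stated λ is the suggested regularization parameter:

```latex
\min_{\mathcal{L},\,\mathcal{E}} \; \|\mathcal{L}\|_* + \lambda \|\mathcal{E}\|_1
\quad \text{s.t.} \quad \mathcal{X} = \mathcal{L} + \mathcal{E},
\qquad \lambda = \frac{1}{\sqrt{\max(n_1,n_2)\, n_3}}
```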
References
[1] M. E. Kilmer and C. D. Martin. Factorization strategies for third-order tensors. Linear Algebra and its Applications, 435(3):641-658, 2011.
[2] E. Kernfeld, M. E. Kilmer, and S. Aeron. Tensor-tensor products with invertible linear transforms. Linear Algebra and its Applications, 485:545-570, 2015.
[3] C. Lu, J. Feng, Y. Chen, W. Liu, Z. Lin, and S. Yan. Tensor robust principal component analysis with a new tensor nuclear norm. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019.
[4] C. Lu, J. Feng, Y. Chen, W. Liu, Z. Lin, and S. Yan. Tensor robust principal component analysis: Exact recovery of corrupted low-rank tensors via convex optimization. In IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), 2016.
[5] C. Lu, J. Feng, Z. Lin, and S. Yan. Exact low tubal rank tensor recovery from Gaussian measurements. In International Joint Conference on Artificial Intelligence (IJCAI), 2018.
[6] C. Lu, X. Peng, and Y. Wei. Low-rank tensor completion with a new tensor nuclear norm induced by invertible linear transforms. In IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), 2019.
[7] C. Lu. Exact recovery of tensor robust principal component analysis under linear transforms. arXiv preprint arXiv:1907.08288, 2019.
[8] C. Lu, J. Feng, S. Yan, and Z. Lin. A unified alternating direction method of multipliers by majorization minimization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40:527-541, 2018.