Multilinear principal component analysis

Multilinear principal component analysis (MPCA) is a multilinear extension of principal component analysis (PCA) that is used to analyze M-way arrays, also informally referred to as "data tensors". M-way arrays may be modeled by linear tensor models, such as CANDECOMP/Parafac, or by multilinear tensor models, such as multilinear principal component analysis (MPCA) or multilinear independent component analysis (MICA).

The origin of MPCA can be traced back to the tensor rank decomposition introduced by Frank Lauren Hitchcock in 1927;[1] to the Tucker decomposition;[2] and to Peter Kroonenberg's "3-mode PCA" work.[3] In 2000, De Lathauwer et al. restated the work of Tucker and Kroonenberg in clear and concise numerical computational terms in their SIAM paper "Multilinear Singular Value Decomposition"[4] (HOSVD) and in their paper "On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors".[5]

Circa 2001, Vasilescu and Terzopoulos reframed the data analysis, recognition and synthesis problems as multilinear tensor problems. Tensor factor analysis models data as the compositional consequence of several causal factors of data formation, and is well suited for multi-modal data tensor analysis. The power of the tensor framework was showcased by analyzing human motion joint angles, facial images and textures in terms of their causal factors of data formation in the following works: Human Motion Signatures[6] (CVPR 2001, ICPR 2002), face recognition – TensorFaces[7][8] (ECCV 2002, CVPR 2003, etc.), and computer graphics – TensorTextures[9] (Siggraph 2004).

Historically, MPCA has been referred to as "M-mode PCA", a term coined by Peter Kroonenberg in 1980.[3] In 2005, Vasilescu and Terzopoulos introduced the multilinear PCA[10] terminology to better differentiate between linear and multilinear tensor decompositions, and to distinguish the work[6][7][8][9] that computed second-order statistics associated with each data tensor mode (axis) from the subsequent work on multilinear independent component analysis,[10] which computed higher-order statistics associated with each tensor mode (axis).

Multilinear PCA may be applied to compute the causal factors of data formation, or used as a signal processing tool on data tensors whose individual observations have either been vectorized[6][7][8][9] or treated as collections of column/row observations ("data matrices") and concatenated into a data tensor. The main disadvantage of this approach is that, rather than computing all possible combinations, MPCA computes a set of orthonormal matrices associated with each mode of the data tensor, which are analogous to the orthonormal row and column spaces of a matrix computed by the matrix SVD. This transformation aims to capture as high a variance as possible, accounting for as much of the variability in the data associated with each data tensor mode (axis).

The algorithm

The MPCA solution follows the alternating least squares (ALS) approach and is iterative in nature. As in PCA, MPCA works on centered data. Centering is a little more complicated for tensors, and it is problem dependent. An illustrative sketch of the procedure is given at the end of this article.

Feature selection

MPCA features: Supervised MPCA is employed in causal factor analysis that facilitates object recognition,[11] while a semi-supervised MPCA feature selection is employed in visualization tasks.[12]

Extensions

Various extensions of MPCA have been proposed.
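
Illustrative sketch

The ALS procedure described under "The algorithm" can be illustrated for the two-mode (image) case with a short NumPy example. This is a minimal sketch under assumed shapes, a fixed iteration count, and an identity-based initialization; the function name mpca and all variable names are illustrative assumptions, not a reference implementation.

# Minimal MPCA-via-ALS sketch for a collection of 2-D observations (e.g. images).
# Assumptions: data tensor X of shape (n_samples, I1, I2), target ranks (R1, R2).
import numpy as np

def mpca(X, ranks, n_iter=10):
    """Return mode-wise projection matrices (U1, U2) and projected features Z."""
    X = X - X.mean(axis=0)                      # center over the sample mode
    n, I1, I2 = X.shape
    U1 = np.eye(I1)[:, :ranks[0]]               # illustrative initialization
    U2 = np.eye(I2)[:, :ranks[1]]
    for _ in range(n_iter):
        # Mode-1 update: project along mode 2, then take the leading
        # eigenvectors of the accumulated mode-1 scatter matrix.
        Y = np.einsum('nij,jb->nib', X, U2)
        S1 = np.einsum('nib,njb->ij', Y, Y)
        _, V1 = np.linalg.eigh(S1)
        U1 = V1[:, ::-1][:, :ranks[0]]          # top-R1 eigenvectors
        # Mode-2 update, symmetrically.
        Y = np.einsum('nij,ia->naj', X, U1)
        S2 = np.einsum('naj,nak->jk', Y, Y)
        _, V2 = np.linalg.eigh(S2)
        U2 = V2[:, ::-1][:, :ranks[1]]          # top-R2 eigenvectors
    # Features: each centered sample projected onto both mode subspaces.
    Z = np.einsum('nij,ia,jb->nab', X, U1, U2)
    return U1, U2, Z

# Example usage: 100 random 32x24 "images" reduced to 5x4 feature arrays.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 32, 24))
    U1, U2, Z = mpca(X, ranks=(5, 4))
    print(U1.shape, U2.shape, Z.shape)          # (32, 5) (24, 4) (100, 5, 4)

In practice, the projection matrices are often initialized from a truncated HOSVD of the data tensor and the iteration is stopped when the captured variation changes by less than a tolerance, rather than after a fixed number of passes; the fixed-iteration loop above is only for brevity.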
References