dbo:abstract
|
- Within statistics, Multilinear principal component analysis (MPCA) is a multilinear extension of principal component analysis (PCA). MPCA is employed in the analysis of n-way arrays, i.e. a cube or hyper-cube of numbers, also informally referred to as a "data tensor". N-way arrays may be decomposed, analyzed, or modeled by
* linear tensor models such as CANDECOMP/Parafac, or
* multilinear tensor models, such as multilinear principal component analysis (MPCA), or multilinear independent component analysis (MICA), etc. The origin of MPCA can be traced back to the Tucker decomposition and Peter Kroonenberg's "M-mode PCA/3-mode PCA" work. In 2000, De Lathauwer et al. restated Tucker's and Kroonenberg's work in clear and concise numerical terms in their SIAM paper on the higher-order singular value decomposition (HOSVD) and in their paper "On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors". Circa 2001, Vasilescu reframed data analysis, recognition and synthesis problems as multilinear tensor problems, based on the insight that most observed data are the compositional consequence of several causal factors of data formation and are therefore well suited to multi-modal data tensor analysis. The power of the tensor framework was showcased by analyzing human motion joint angles, facial images and textures in terms of their causal factors of data formation in the following works: Human Motion Signatures (CVPR 2001, ICPR 2002), face recognition (ECCV 2002, CVPR 2003, etc.) and computer graphics (SIGGRAPH 2004). Historically, MPCA has been referred to as "M-mode PCA", terminology coined by Peter Kroonenberg in 1980. In 2005, Vasilescu and Terzopoulos introduced the "Multilinear PCA" terminology to better differentiate between linear and multilinear tensor decompositions, and to distinguish the work that computed second-order statistics associated with each data tensor mode (axis) from subsequent work on multilinear independent component analysis, which computed higher-order statistics associated with each tensor mode/axis. Multilinear PCA may be applied to compute the causal factors of data formation, or used as a signal-processing tool on data tensors whose individual observations have either been vectorized, or treated as matrices and concatenated into a data tensor. MPCA computes a set of orthonormal matrices, one associated with each mode of the data tensor, analogous to the orthonormal row and column spaces of a matrix computed by the matrix SVD. This transformation aims to capture as high a variance as possible, accounting for as much of the variability in the data associated with each data tensor mode (axis). (en)
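A minimal numerical sketch of the mode-wise construction described above, assuming NumPy; the function names (mode_unfold, mpca_projections, project), the random 30x30x50 data tensor and the target ranks are illustrative assumptions, and the sketch uses a single truncated SVD per mode-n unfolding (in the spirit of the HOSVD initialization) rather than the full iterative MPCA optimization:

import numpy as np

def mode_unfold(T, mode):
    # Unfold tensor T along `mode` into a matrix of shape (T.shape[mode], -1).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mpca_projections(T, ranks):
    # One orthonormal factor per mode: the leading left singular vectors
    # of each mode-n unfolding, truncated to the requested rank.
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(mode_unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    return factors

def project(T, factors):
    # Multiply T by the transpose of each factor along its mode,
    # yielding a smaller core tensor (the multilinear projection).
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core

# Example: a 3-mode data tensor (e.g. rows x columns x samples) reduced to 5x5x10.
T = np.random.randn(30, 30, 50)
factors = mpca_projections(T, ranks=(5, 5, 10))
core = project(T, factors)
print(core.shape)  # (5, 5, 10)

Each per-mode factor plays the role that the left and right singular vectors play for a matrix, which is the analogy with the matrix SVD drawn in the abstract above.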
- Multilinear principal component analysis (MPCA) maps a high-dimensional space onto a lower-dimensional one; dimensionality is reduced by discarding the unimportant feature vectors. Compared with ordinary principal component analysis, MPCA preserves the structure of the data and achieves a better explained-variance ratio. MPCA is an extension of principal component analysis (PCA) to multiple dimensions: PCA projects vectors onto vectors, whereas MPCA projects tensors onto tensors. The projection structure is comparatively simple, and the computations are carried out in lower-dimensional spaces, so MPCA has a lower computational cost when handling high-dimensional data. For example, given a 100x100 image, PCA operates on a 10000x1 vector, whereas MPCA operates on a 100x1 vector in each of the two modes. For the same amount of dimensionality reduction, PCA must estimate 49 times more parameters than MPCA ((10000/(100x2)) - 1 = 49), so in practice MPCA can be more efficient than PCA. (zh)
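A small worked check of the parameter-count comparison in the example above, using only the numbers already given (a 100x100 image, one basis vector per mode); the variable names are illustrative:

# PCA vectorizes the 100x100 image, so each basis vector has 100*100 entries.
pca_entries_per_component = 100 * 100     # 10000
# MPCA keeps the two image modes separate, so a pair of mode-1/mode-2
# basis vectors has only 100 + 100 entries.
mpca_entries_per_component = 100 + 100    # 200
ratio = pca_entries_per_component / mpca_entries_per_component - 1
print(ratio)  # 49.0, i.e. PCA estimates 49 times more parameters, as stated above.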
|
rdfs:comment
|
- Multilinear principal component analysis (MPCA) maps a high-dimensional space onto a lower-dimensional one; dimensionality is reduced by discarding the unimportant feature vectors. Compared with ordinary principal component analysis, MPCA preserves the structure of the data and achieves a better explained-variance ratio. MPCA is an extension of principal component analysis (PCA) to multiple dimensions: PCA projects vectors onto vectors, whereas MPCA projects tensors onto tensors. The projection structure is comparatively simple, and the computations are carried out in lower-dimensional spaces, so MPCA has a lower computational cost when handling high-dimensional data. For example, given a 100x100 image, PCA operates on a 10000x1 vector, whereas MPCA operates on a 100x1 vector in each of the two modes. For the same amount of dimensionality reduction, PCA must estimate 49 times more parameters than MPCA ((10000/(100x2)) - 1 = 49), so in practice MPCA can be more efficient than PCA. (zh)
- Within statistics, Multilinear principal component analysis (MPCA) is a multilinear extension of principal component analysis (PCA). MPCA is employed in the analysis of n-way arrays, i.e. a cube or hyper-cube of numbers, also informally referred to as a "data tensor". N-way arrays may be decomposed, analyzed, or modeled by
* linear tensor models such as CANDECOMP/Parafac, or
* multilinear tensor models, such as multilinear principal component analysis (MPCA), or multilinear independent component analysis (MICA), etc. (en)
|