1. Representative Previous Work
PCA, LDA
ISOMAP: geodesic-distance preserving (J. Tenenbaum et al., 2000)
LLE: local neighborhood-relationship preserving (S. Roweis & L. Saul, 2000)
LE/LPP: local-similarity preserving (M. Belkin, P. Niyogi et al., 2001, 2003)
2. Hundreds of Dimensionality Reduction Algorithms
Statistics-based: PCA/KPCA, LDA/KDA, …
Geometry-based: ISOMAP, LLE, LE/LPP, …
Matrix-based and tensor-based variants
Is there a common perspective from which to understand and explain these dimensionality reduction algorithms? Is there a unified formulation shared by them? Is there a general tool to guide the development of new dimensionality reduction algorithms?
3. Our Answers
Four types of formulation, with examples:
Direct graph embedding: original PCA & LDA, ISOMAP, LLE, Laplacian Eigenmap
Linearization: PCA, LDA, LPP
Kernelization: KPCA, KDA
Tensorization: CSA, DATER
(S. Yan, D. Xu, H. Zhang et al., CVPR 2005; T-PAMI 2007)
4. Direct Graph Embedding
Intrinsic graph S and penalty graph S^p: similarity matrices (graph edge weights) measuring similarity in the high-dimensional space.
L, B: Laplacian matrices derived from S and S^p.
Data lie in a high-dimensional space; the low-dimensional representation y is assumed to be 1-D here.
5. Direct Graph Embedding -- Continued
With S, S^p, L, and B as defined above, the criterion to preserve graph similarity is
y* = arg min_{y^T B y = c} Σ_{i≠j} ||y_i − y_j||² S_ij = arg min_{y^T B y = c} y^T L y.
Special case: B is the identity matrix (scale normalization).
Problem: it cannot handle new test data (no out-of-sample mapping).
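The scale-normalized special case above (B = I) can be sketched numerically: minimizing y^T L y subject to y^T y = 1 is solved by the eigenvectors of L with the smallest nonzero eigenvalues. The toy two-cluster graph below is an illustrative assumption, not data from the talk.

```python
# Sketch of direct graph embedding for the special case B = I:
# the optimal 1-D embedding is the second-smallest eigenvector of L.
import numpy as np

def graph_embedding(S, dims=1):
    """Embed n graph vertices into `dims` dimensions from similarity S."""
    L = np.diag(S.sum(axis=1)) - S      # graph Laplacian L = D - S
    vals, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
    return vecs[:, 1:dims + 1]          # skip the trivial constant vector

# Two clusters of 3 vertices, weakly connected by a 0.1 edge.
S = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, .1, 0, 0],
              [0, 0, .1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
y = graph_embedding(S)  # 1-D coordinates separating the two clusters
```

Vertices joined by strong edges receive nearby coordinates, which is exactly the graph-preserving criterion at work.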
6. Linearization
Linear mapping function: y = X^T w, i.e., y_i = w^T x_i, applied to both the intrinsic and penalty graphs.
Objective function in linearization:
w* = arg min_{w^T X B X^T w = c} w^T X L X^T w.
Problem: a linear mapping function may not be enough to preserve the real nonlinear structure.
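A minimal sketch of the linearization step, assuming B = I and a k-NN similarity graph (data, k, and sizes are illustrative): substituting y = X^T w turns the criterion into the generalized eigenproblem X L X^T w = λ X X^T w, whose smallest eigenvector gives the projection w.

```python
# Sketch: linearized graph embedding (LPP-style) with B = I.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 20))           # d=3 features, n=20 samples (columns)

# Binary k-NN similarity graph (k=3) in the original space.
d2 = ((X[:, :, None] - X[:, None, :])**2).sum(axis=0)
S = np.zeros((20, 20))
for i in range(20):
    S[i, np.argsort(d2[i])[1:4]] = 1       # skip self at position 0
S = np.maximum(S, S.T)                     # symmetrize
L = np.diag(S.sum(axis=1)) - S             # Laplacian of the intrinsic graph

# Smallest generalized eigenvector of (X L X^T, X X^T) gives w.
vals, vecs = eigh(X @ L @ X.T, X @ X.T)
w = vecs[:, 0]
y = w @ X                                  # 1-D coordinates of all samples
```

Unlike the direct embedding, w now maps any new sample x via w^T x, which addresses the out-of-sample problem noted on the previous slide.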
7. Kernelization
Nonlinear mapping φ: maps the original input space to a higher-dimensional Hilbert space.
Kernel matrix: K_ij = k(x_i, x_j) = φ(x_i) · φ(x_j).
Constraint: w = Σ_i α_i φ(x_i), so that y = K α.
Objective function in kernelization:
α* = arg min_{α^T K B K α = c} α^T K L K α.
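The kernelized criterion can be sketched in a few lines: with w expanded as Σ_i α_i φ(x_i) and B = I, the problem becomes the generalized eigenproblem K L K α = λ K K α. The RBF kernel and the crude median-threshold graph below are assumptions for illustration.

```python
# Sketch: kernelized graph embedding with an RBF kernel and B = I.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.standard_normal((15, 2))                   # n=15 samples (rows)
d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)  # pairwise squared distances
K = np.exp(-d2)                                    # RBF kernel matrix (strictly PD)

S = (d2 < np.median(d2)).astype(float)             # crude similarity graph
np.fill_diagonal(S, 0)
L = np.diag(S.sum(1)) - S                          # intrinsic-graph Laplacian

vals, vecs = eigh(K @ L @ K, K @ K)                # K L K a = lam K K a
alpha = vecs[:, 0]                                 # expansion coefficients
y = K @ alpha                                      # embedding of training points
```

A new sample x is embedded as Σ_i α_i k(x, x_i), so the nonlinear map never has to be computed explicitly.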
8. Tensorization
Each sample is represented as a tensor X_i, and its low-dimensional representation is obtained as y_i = X_i ×₁ w¹ ×₂ w² … ×ₙ wⁿ.
Objective function in tensorization (intrinsic and penalty graphs as before):
(w¹, …, wⁿ)* = arg min Σ_{i≠j} ||y_i − y_j||² S_ij, subject to the scale constraint defined by B (or by the penalty graph S^p).
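For second-order tensors (matrices), the multilinear projection above reduces to y_i = w¹ᵀ X_i w². The sketch below shows only this mapping; in practice w¹ and w² are obtained by alternating optimization of the graph-preserving criterion, and the random vectors here are placeholders.

```python
# Sketch: the tensorization projection y_i = X_i x_1 w1 x_2 w2 = w1^T X_i w2
# for matrix-shaped samples (w1, w2 are illustrative placeholders).
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal((10, 8, 6))       # ten 8x6 image-like tensors
w1, w2 = rng.standard_normal(8), rng.standard_normal(6)

y = np.array([w1 @ Xi @ w2 for Xi in samples])  # one scalar per sample
```

Because the sample keeps its 8x6 structure, only 8 + 6 parameters are learned instead of the 48 a vectorized projection would need.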
9. Common Formulation
S, S^p: similarity matrices (intrinsic and penalty graphs); L, B: Laplacian matrices from S, S^p.
Direct graph embedding: y* = arg min_{y^T B y = c} y^T L y
Linearization: w* = arg min_{w^T X B X^T w = c} w^T X L X^T w
Kernelization: α* = arg min_{α^T K B K α = c} α^T K L K α
Tensorization: (w¹, …, wⁿ)* = arg min Σ_{i≠j} ||y_i − y_j||² S_ij, where y_i = X_i ×₁ w¹ … ×ₙ wⁿ
10. A General Framework for Dimensionality Reduction
D: direct graph embedding
L: linearization
K: kernelization
T: tensorization
11. New Dimensionality Reduction Algorithm: Marginal Fisher Analysis
Important information for face recognition: 1) label information; 2) local manifold structure (neighborhood or margin).
Intrinsic graph: S_ij = 1 if x_i is among the k1 nearest neighbors of x_j in the same class; 0 otherwise.
Penalty graph: S^p_ij = 1 if the pair (i, j) is among the k2 shortest between-class pairs in the data set; 0 otherwise.
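The two graph definitions above can be sketched directly (toy 2-D data and the values of k1, k2 are illustrative assumptions): the intrinsic graph links each point to its k1 nearest same-class neighbors, and the penalty graph marks the k2 shortest between-class pairs.

```python
# Sketch: constructing the MFA intrinsic graph S and penalty graph Sp.
import numpy as np

def mfa_graphs(X, labels, k1=2, k2=4):
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)  # squared distances
    S = np.zeros((n, n))
    Sp = np.zeros((n, n))
    same = labels[:, None] == labels[None, :]
    for i in range(n):
        # k1 nearest neighbors of x_i within its own class (skip self).
        order = np.argsort(np.where(same[i], d2[i], np.inf))
        S[i, order[1:k1 + 1]] = 1
    # k2 shortest between-class pairs; each pair appears twice (i,j),(j,i).
    between = np.argwhere(~same)
    shortest = between[np.argsort(d2[~same])[:2 * k2]]
    Sp[shortest[:, 0], shortest[:, 1]] = 1
    return np.maximum(S, S.T), Sp

rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((5, 2)), rng.standard_normal((5, 2)) + 3])
labels = np.array([0] * 5 + [1] * 5)
S, Sp = mfa_graphs(X, labels)
```

With S and Sp in hand, MFA plugs their Laplacians L and B into the common graph-embedding formulation of the earlier slides.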
14. Summary
An optimization framework that unifies previous dimensionality reduction algorithms as special cases.
A new dimensionality reduction algorithm: Marginal Fisher Analysis.
16. 56 events are defined in LSCOM, e.g., Airplane Flying, Riot, Exiting Car.
Challenges: geometric and photometric variances, cluttered backgrounds, complex camera motion and object motion. The events are more diverse!
17. Earth Mover's Distance in the Temporal Domain (T-MM, under review)
Key frames of two video clips in the class "riot".
EMD can efficiently utilize the information from multiple frames.
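A minimal sketch of temporal EMD between two clips, assuming equal key-frame counts with uniform weights (in that special case EMD reduces to an optimal one-to-one assignment; the random features are illustrative, not the paper's descriptors):

```python
# Sketch: EMD between two clips' key-frame feature sets (uniform weights,
# equal sizes), computed as an optimal assignment over frame pairs.
import numpy as np
from scipy.optimize import linear_sum_assignment

def emd_uniform(frames_a, frames_b):
    """EMD between two equal-size sets of frame feature vectors."""
    # Pairwise Euclidean ground distances between key frames.
    cost = np.linalg.norm(frames_a[:, None, :] - frames_b[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)  # optimal flow = 1-to-1 matching
    return cost[rows, cols].mean()            # average transport cost

rng = np.random.default_rng(0)
clip_a = rng.standard_normal((5, 16))  # 5 key frames, 16-D features each
clip_b = clip_a[::-1] + 0.01           # same frames, reordered and jittered
d = emd_uniform(clip_a, clip_b)
```

Because the matching is order-free, the reordered clip still scores as nearly identical, which is why EMD exploits information from multiple frames better than a frame-by-frame comparison.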
21. Future Work
Machine Learning, Pattern Recognition, Computer Vision, Multimedia, Multimedia Content Analysis, Web Search, Event Recognition, Biometrics
22. Acknowledgement
Shuicheng Yan (UIUC); Steve Lin (Microsoft); Lei Zhang (Microsoft); Hong-Jiang Zhang (Microsoft); Shih-Fu Chang (Columbia); Xuelong Li (UK); Xiaoou Tang (Hong Kong); Zhengkai Liu (USTC)
24. What Are Gabor Features?
Gabor features can improve recognition performance in comparison to grayscale features (Chengjun Liu, T-IP, 2002).
Input: grayscale image. Gabor wavelet kernels at five scales and eight orientations. Output: 40 Gabor-filtered images.
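The 5-scale, 8-orientation kernel bank behind the 40 filtered images can be sketched as follows (the wavelengths, kernel size, and envelope parameters are illustrative assumptions, not the exact settings of Liu, T-IP 2002):

```python
# Sketch: a 5-scale x 8-orientation Gabor kernel bank (real parts).
import numpy as np

def gabor_kernel(wavelength, theta, size=31, sigma=None, gamma=0.5):
    """Real part of a 2-D Gabor wavelet kernel."""
    sigma = sigma or 0.56 * wavelength
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by the orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# Five scales (wavelengths) x eight orientations -> 40 kernels.
bank = [gabor_kernel(w, t)
        for w in (4, 4 * 2**0.5, 8, 8 * 2**0.5, 16)
        for t in np.arange(8) * np.pi / 8]
```

Convolving the input image with each kernel yields the 40 Gabor-filtered images described above.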
25. How to Utilize More Correlations?
Pixel rearrangement converts sets of highly correlated pixels into columns of highly correlated pixels.
Underlying assumption in previous tensor-based subspace learning: intra-tensor correlations, i.e., correlations among the features within certain tensor dimensions, such as rows, columns, and Gabor features.
26. Tensor Representation: Advantages
1. Enhanced learnability
2. Appreciable reduction in computational cost
3. Large number of available projection directions
4. Use of structural information
27. Connection to Previous Work: Tensorface (M. Vasilescu and D. Terzopoulos, 2002)
From an algorithmic or mathematical view, CSA and Tensorface are both variants of the rank-(R1, R2, …, Rn) decomposition.
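The shared machinery can be sketched with a truncated HOSVD, the standard one-shot approximation to the rank-(R1, …, Rn) decomposition (variable names and the toy tensor are illustrative; a full solution would iterate these steps, as in HOOI):

```python
# Sketch: truncated HOSVD, the basic rank-(R1,...,Rn) approximation
# underlying both Tensorface and CSA.
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: the chosen mode becomes the rows of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def truncated_hosvd(tensor, ranks):
    """Return per-mode factor matrices U_n and the core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each mode's unfolding.
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):
        # Project each mode onto its subspace: core = core x_n U_n^T.
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return factors, core

rng = np.random.default_rng(1)
T = rng.standard_normal((6, 5, 4))
factors, core = truncated_hosvd(T, (3, 3, 2))
```

Tensorface chooses the factors to model the modes of a face ensemble (identity, pose, illumination), while CSA chooses them to optimize a subspace-learning criterion; both operate on the same multilinear decomposition.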