DEIM Forum 2010 D1-4

Higher Order Data Classification Method with Autocorrelation Matrix Correcting on HOSVD

Junichi MORIGAKI and Kaoru KATAYAMA
Graduate School of System Design, Tokyo Metropolitan University
E-mail: j.morigaki@gmail.com, katayama@tmu.ac.jp

Abstract  We propose a classification method for tensor data with higher order processing based on the Higher Order Singular Value Decomposition (HOSVD) by Lathauwer et al. Recently, due to the increasing complexity of data, vectors and matrices are sometimes not suitable for describing the true nature of the data. A tensor is a concept for describing data of higher order and dimension, and in data classification, feature quantities can be described by tensors. In our method, we improve the classification method by Savas et al. by correcting the autocorrelation matrices used in the HOSVD and by a combinatorial subspace construction following Sun et al.

Key words  HOSVD, Image Classification, Higher Order Data Analysis

1. Introduction

Higher order decompositions of tensor data, such as the higher order singular value decomposition (HOSVD) by Lathauwer et al. [1] and PARAFAC [2], generalize the matrix singular value decomposition to data with more than two modes. Savas et al. proposed a handwritten digit classification method based on the HOSVD [3]. In this paper we improve that method in two ways: the orthogonal factors of the HOSVD are computed from autocorrelation matrices, which are corrected using misclassified training samples, and several subspaces built from different tensor constructions are combined for classification. Throughout the paper, tensor slices are written in MATLAB-style notation (e.g. A(:, :, ν)).

2. Related Work

Methods that treat data directly as higher order tensors have been studied in several fields. Wang and Ahuja decomposed facial expressions with a higher order decomposition [4]. Vasilescu and Terzopoulos applied multilinear analysis based on the HOSVD to ensembles of face images and to image-based rendering in computer graphics [5], [6], organizing a collection of face images as a fifth order tensor whose modes are people, views,
illums, express and pixels. Face recognition based on independent component analysis [7] has likewise been extended to a multilinear form [8]. Sun et al. proposed CubeSVD, a higher order decomposition for personalized web search [9], and Kolda and Bader proposed the TOPHITS model for higher order web link analysis [10].

Figure 1  3-Tensor

3. Preliminaries

This section summarizes the tensor operations and the HOSVD used in the rest of the paper.

Definition 1 (inner product). For two Nth order tensors A and B of the same dimensions,
    ⟨A, B⟩ = Σ_{i_1,...,i_N} b_{i_1 i_2 ... i_N} a_{i_1 i_2 ... i_N}.   (1)

Definition 2 (norm). The norm of an Nth order tensor A is
    ‖A‖ = √⟨A, A⟩ ≥ 0.   (2)

Definition 3 (unfolding). For a 3rd order tensor A ∈ R^{I×J×K} (Figure 1), the mode-n unfoldings are the matrices
    A_(1) ∈ R^{I×JK}:  a_{ijk} = a^(1)_{iν},  ν = j + (k − 1)J,
    A_(2) ∈ R^{J×KI}:  a_{ijk} = a^(2)_{jν},  ν = k + (i − 1)K,
    A_(3) ∈ R^{K×IJ}:  a_{ijk} = a^(3)_{kν},  ν = i + (j − 1)I.

Definition 4 (n-mode product). For A ∈ R^{I_1×I_2×...×I_N} and U ∈ R^{J_n×I_n}, the n-mode product A ×_n U is the tensor of size I_1 × ... × I_{n−1} × J_n × I_{n+1} × ... × I_N with entries
    (A ×_n U)_{i_1 ... i_{n−1} j_n i_{n+1} ... i_N} = Σ_{i_n} a_{i_1 ... i_{n−1} i_n i_{n+1} ... i_N} u_{j_n i_n}.   (3)

The n-mode product satisfies
    (A ×_n F) ×_m G = (A ×_m G) ×_n F,  m ≠ n,   (4)
    (A ×_n F) ×_n G = A ×_n (GF).   (5)

Theorem 1 (HOSVD [1]). Every Nth order tensor A can be written as
    A = S ×_1 U^(1) ×_2 ... ×_N U^(N),   (6)
where
(1) each U^(n) (n = 1, ..., N) is an orthogonal matrix, and
(2) the core tensor S, of the same size as A, satisfies
  (a) all-orthogonality: ⟨S_{i_n=α}, S_{i_n=β}⟩ = 0 for α ≠ β,   (7)
  (b) ordering: ‖S_{i_n=1}‖ ≥ ‖S_{i_n=2}‖ ≥ ... ≥ ‖S_{i_n=I_n}‖ ≥ 0,   (8)
where S_{i_n=α} denotes the subtensor of S obtained by fixing the nth index to α.
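The authors implemented their experiments in MATLAB; as a language-neutral illustration of Definitions 3 and 4 and of Theorem 1, the following Python/NumPy sketch (function names and dimensions are ours, not from the paper) computes the mode-n unfoldings, the n-mode product and the HOSVD of a small 3rd order tensor and checks the reconstruction of equation (6).

import numpy as np

def unfold(A, mode):
    # Mode-n unfolding of a 3rd order tensor (Definition 3): rows are indexed
    # by the chosen mode, the remaining two indices are flattened in the order
    # used in the text; e.g. mode 1 gives axis order (0, 2, 1).
    order = [(mode - 1 - t) % 3 for t in range(3)]
    return np.transpose(A, order).reshape(A.shape[mode - 1], -1)

def fold(M, mode, shape):
    # Inverse of unfold: rebuild the 3rd order tensor with the given shape.
    order = [(mode - 1 - t) % 3 for t in range(3)]
    inv = np.argsort(order)
    return np.transpose(M.reshape([shape[o] for o in order]), inv)

def nmode_product(A, U, mode):
    # n-mode product A x_n U of Definition 4, with U of shape (J_n, I_n).
    shape = list(A.shape)
    shape[mode - 1] = U.shape[0]
    return fold(U @ unfold(A, mode), mode, shape)

def hosvd(A):
    # HOSVD of a 3rd order tensor (Theorem 1): A = S x_1 U1 x_2 U2 x_3 U3,
    # with U_m taken from the left singular vectors of the mode-m unfolding.
    U = [np.linalg.svd(unfold(A, m), full_matrices=True)[0] for m in (1, 2, 3)]
    S = A
    for m, Um in enumerate(U, start=1):
        S = nmode_product(S, Um.T, m)      # core tensor S = A x_1 U1^T x_2 U2^T x_3 U3^T
    return S, U

# quick check on a random tensor
A = np.random.rand(4, 5, 6)
S, U = hosvd(A)
A_rec = S
for m, Um in enumerate(U, start=1):
    A_rec = nmode_product(A_rec, Um, m)
print(np.allclose(A, A_rec))               # True: equation (6) holds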
4. Classification Based on the HOSVD

4.1 The method of Savas et al.

Savas et al. proposed a handwritten digit classification method based on the HOSVD [3]. The training images of each class are stacked into a 3rd order tensor A, and from its HOSVD A = S ×_1 U^(1) ×_2 U^(2) ×_3 U^(3) the basis tensors
    A_ν = S(:, :, ν) ×_1 U^(1) ×_2 U^(2)   (9)
are computed. By the all-orthogonality of the core tensor S, these slices are mutually orthogonal:
    ⟨A_ν, A_µ⟩ = 0,  ν ≠ µ.   (10)
A test tensor D, normalized so that ‖D‖ = 1, is compared with each class µ through the residual with respect to the first k (normalized) basis tensors,
    R(µ) = 1 − Σ_{ν=1}^{k} ⟨D, A^µ_ν⟩²,   (11)
and is assigned to the class with the smallest R(µ).

4.2 Subspace methods

The classification rule above is a subspace method: each class is represented by the subspace spanned by its basis tensors. In the learning subspace method of Kohonen [11], the class subspaces are not fixed after training; when a training sample x of class ν is classified into a wrong class µ, the subspaces are corrected so that the projection of x onto the subspace of class µ becomes smaller. Sun et al. applied the subspace method to handwritten character recognition with an improved directional element feature, constituting the classifier from a combination of subspaces [12]. The proposed method applies both ideas to the method of Savas et al.
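As a concrete illustration of the method of Savas et al. described in 4.1 (restated as Algorithms 1 and 2 below), the following sketch builds the K orthonormal basis slices of each class from the HOSVD of its training tensor and classifies a test image by the residual of equation (11). It reuses unfold, nmode_product and hosvd from the sketch at the end of Section 3; the image size, the number of training images and K are placeholder values, not the settings used in the experiments.

import numpy as np
# reuses unfold, nmode_product and hosvd from the sketch at the end of Section 3

def class_basis(A_mu, K):
    # Algorithm 1 for one class: the first K basis slices of equation (9),
    # normalised so that they are orthonormal with respect to the inner product (1)
    S, (U1, U2, U3) = hosvd(A_mu)
    B = nmode_product(nmode_product(S[:, :, :K], U1, 1), U2, 2)
    for k in range(K):
        B[:, :, k] /= np.linalg.norm(B[:, :, k])
    return B

def residual(X, B):
    # R(mu) of equation (11) for a normalised test tensor X
    return 1.0 - sum(np.sum(X * B[:, :, k]) ** 2 for k in range(B.shape[2]))

def classify(X, bases):
    # Algorithm 2: assign X to the class with the smallest residual
    X = X / np.linalg.norm(X)
    return min(range(len(bases)), key=lambda mu: residual(X, bases[mu]))

# toy example: two classes of random 28 x 28 "images", 50 per class, K = 10
rng = np.random.default_rng(0)
train = [rng.random((28, 28, 50)) for _ in range(2)]
bases = [class_basis(A, K=10) for A in train]
print(classify(train[0][:, :, 0], bases))   # a training image of class 0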
5. Proposed Method

5.1 The method of Savas et al. as algorithms

Algorithm 1 restates the training step of the method of Savas et al.: for each class µ, the HOSVD of the class tensor A_µ is computed, and the first K basis tensors are collected into S_µ and normalized so that ⟨S_µ(:, :, j), S_µ(:, :, k)⟩ = δ_jk (δ_jk: the Kronecker delta). Algorithm 2 restates the classification step: the unknown tensor X is normalized so that ‖X‖ = 1 and assigned to the class with the smallest residual.

5.2 The proposed training and classification

The proposed method modifies this procedure in three ways. First, six tensors A^µ(1), ..., A^µ(6) are constructed from the training images of each class µ, and a basis is built for each of them (Algorithm 3, lines 2-7). Second, the orthogonal factor of mode m is obtained not from an SVD of the unfolding but as the eigenvector matrix of the autocorrelation matrix C^µ(i)_m = A^µ(i)_(m) A^µ(i)T_(m) (Algorithm 3, lines 13-15); because these matrices are kept, the bases can be corrected afterwards. Algorithm 4 performs this correction: whenever a training image is misclassified by Algorithm 5, the autocorrelation matrices of the wrongly selected class are corrected with weight β and its bases are recomputed. Third, Algorithm 5 classifies an unknown tensor X by normalizing it so that ‖X‖ = 1, as in the method of Savas et al., and assigning it to the class whose six bases give the smallest total residual.

Algorithm 1  Training in the method of Savas et al.
Input: tensors A_µ built from the training images α^µ_{ν(µ)}, µ = 1...M, ν(µ) = 1...N(µ); the number K of basis tensors
Output: S_µ, µ = 1...M
1: µ ← 1
2: repeat
3:   compute the HOSVD of A_µ: A_µ = B_µ ×_1 U^(1)µ ×_2 U^(2)µ ×_3 U^(3)µ
4:   S_µ ← B_µ(:, :, 1:K) ×_1 U^(1)µ ×_2 U^(2)µ
5:   normalize S_µ so that ⟨S_µ(:, :, j), S_µ(:, :, k)⟩ = δ_jk (δ_jk: the Kronecker delta)
6:   µ ← µ + 1
7: until µ > M
8: return S_µ, µ = 1...M

Algorithm 2  Classification in the method of Savas et al.
Input: X; S_µ, µ = 1...M; K
Output: category
1: µ ← 1
2: category ← −1
3: min ← −1
4: normalize X so that ‖X‖ = 1
5: repeat
6:   if min < 0 or min > 1 − Σ_{i=1}^{K} ⟨X, S_µ(:, :, i)⟩² then
7:     min ← 1 − Σ_{i=1}^{K} ⟨X, S_µ(:, :, i)⟩²
8:     category ← µ
9:   end if
10:  µ ← µ + 1
11: until µ > M
12: return category

Algorithm 3  Training in the proposed method
Input: tensors A_µ built from the training images α^µ_{ν(µ)}, µ = 1...M, ν(µ) = 1...N(µ); the number K of basis tensors
Output: S^µ(i), µ = 1...M, i = 1...6
1: µ ← 1
2: construct A^µ(1) from A_µ
3: construct A^µ(2) from A_µ
4: construct A^µ(3) from A_µ
5: construct A^µ(4) from A_µ
6: construct A^µ(5) from A_µ
7: construct A^µ(6) from A_µ
8: repeat
9:   i ← 1
10:  repeat
11:    m ← 1
12:    repeat
13:      compute the mode-m unfolding A^µ(i)_(m)
14:      C^µ(i)_m ← A^µ(i)_(m) A^µ(i)T_(m)
15:      compute the eigenvector matrix U^µ(i)_m and the eigenvalues of C^µ(i)_m
16:      sort the columns of U^µ(i)_m in decreasing order of the eigenvalues
17:      m ← m + 1
18:    until m > 3
19:    B^µ(i) ← A^µ(i) ×_1 U^µ(i)T_1 ×_2 U^µ(i)T_2 ×_3 U^µ(i)T_3
20:    if K < (the mode-3 dimension of A^µ(i)) then
21:      S^µ(i) ← B^µ(i)(:, :, 1:K) ×_1 U^µ(i)_1 ×_2 U^µ(i)_2
22:    else
23:      S^µ(i) ← B^µ(i)(:, :, :) ×_1 U^µ(i)_1 ×_2 U^µ(i)_2
24:    end if
25:    normalize S^µ(i) so that ⟨S^µ(i)(:, :, j), S^µ(i)(:, :, k)⟩ = δ_jk (δ_jk: the Kronecker delta)
26:    i ← i + 1
27:  until i > 6
28:  µ ← µ + 1
29: until µ > M
30: return S^µ(i), µ = 1...M, i = 1...6
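The step that distinguishes Algorithm 3 from a direct HOSVD is lines 13-15: the mode-m factor is taken from the eigenvectors of the autocorrelation matrix C_m = A_(m) A_(m)^T rather than from an SVD of the unfolding, so that C_m can be stored and corrected later. The following sketch (reusing unfold from the sketch at the end of Section 3; the use of eigh and the sign-invariant comparison are our choices) shows that this yields the same factor as the SVD up to the signs of the columns.

import numpy as np
# reuses unfold from the sketch at the end of Section 3

def factor_from_autocorrelation(A, mode):
    # lines 13-15 of Algorithm 3: mode-m unfolding, autocorrelation matrix
    # C_m = A_(m) A_(m)^T, and its eigenvectors sorted by decreasing eigenvalue
    Am = unfold(A, mode)
    C = Am @ Am.T
    eigvals, V = np.linalg.eigh(C)          # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    return C, V[:, order]

# the eigenvectors of A_(m) A_(m)^T are the left singular vectors of A_(m)
# (up to sign), so this reproduces the HOSVD factor of mode m
A = np.random.rand(4, 5, 6)
C1, U1 = factor_from_autocorrelation(A, 1)
U1_svd = np.linalg.svd(unfold(A, 1))[0]
print(np.allclose(np.abs(U1.T @ U1_svd), np.eye(4), atol=1e-6))   # True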
Algorithm 4  Correction of the bases in the proposed method
Input: S^µ(i), µ = 1...M, i = 1...6; the tensors A^µ(i); training images α_ν, ν = 1...N; autocorrelation matrices C^µ(i)_m, µ = 1...M, i = 1...6, m = 1...3; K; β
Output: S^µ(i), µ = 1...M, i = 1...6
1: ν ← 1
2: repeat
3:   if Algorithm 5 classifies α_ν into a wrong class then
4:     µ ← the class into which α_ν was wrongly classified
5:     i ← 1
6:     repeat
7:       m ← 1
8:       repeat
9:         C^µ(i)_m ← C^µ(i)_m − β α_ν α_ν^T
10:        recompute the eigenvector matrix U^µ(i)_m of C^µ(i)_m
11:        sort the columns of U^µ(i)_m in decreasing order of the eigenvalues
12:        m ← m + 1
13:      until m > 3
14:      B^µ(i) ← A^µ(i) ×_1 U^µ(i)T_1 ×_2 U^µ(i)T_2 ×_3 U^µ(i)T_3
15:      if K < (the mode-3 dimension of A^µ(i)) then
16:        S^µ(i) ← B^µ(i)(:, :, 1:K) ×_1 U^µ(i)_1 ×_2 U^µ(i)_2
17:      else
18:        S^µ(i) ← B^µ(i)(:, :, :) ×_1 U^µ(i)_1 ×_2 U^µ(i)_2
19:      end if
20:      normalize S^µ(i) so that ⟨S^µ(i)(:, :, j), S^µ(i)(:, :, k)⟩ = δ_jk (δ_jk: the Kronecker delta)
21:      i ← i + 1
22:    until i > 6
23:  end if
24:  ν ← ν + 1
25: until ν > N (N: the number of training images)
26: return S^µ(i), µ = 1...M, i = 1...6

Algorithm 5  Classification in the proposed method
Input: X; S^µ(i), µ = 1...M, i = 1...6; K
Output: category
1: µ ← 1
2: category ← −1
3: min ← −1
4: normalize X so that ‖X‖ = 1
5: repeat
6:   result ← 0, i ← 1
7:   repeat
8:     result ← result + (1 − Σ_{k=1}^{K} ⟨X, S^µ(i)(:, :, k)⟩²)
9:     i ← i + 1
10:  until i > 6
11:  if min < 0 or min > result then
12:    min ← result
13:    category ← µ
14:  end if
15:  µ ← µ + 1
16: until µ > M
17: return category
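The two operations that are new relative to Algorithms 1 and 2 are the correction of line 9 in Algorithm 4 and the summed residual of line 8 in Algorithm 5. The sketch below shows both for a single tensor construction and the first two modes only; since the pseudocode writes β α_ν α_ν^T for every mode, the use of α_ν^T α_ν for the mode-2 matrix and the omission of mode 3 are simplifying assumptions of this sketch, and residual comes from the sketch in Section 4.

import numpy as np
# reuses residual from the sketch in Section 4

def correct_class(C1, C2, alpha, beta):
    # line 9 of Algorithm 4 for the first two modes: the class into which the
    # training image alpha (rows x cols) was wrongly classified is weakened by
    # subtracting beta times the image's autocorrelation from the stored matrices;
    # the bases are then rebuilt from the corrected matrices as in Algorithm 3
    C1 = C1 - beta * (alpha @ alpha.T)      # mode-1 matrix, size rows x rows
    C2 = C2 - beta * (alpha.T @ alpha)      # mode-2 matrix, size cols x cols (assumption)
    return C1, C2

def classify_combined(X, bases_per_class):
    # Algorithm 5: normalise X, sum the residuals of equation (11) over the
    # bases of each class (six of them in the paper), take the smallest total
    X = X / np.linalg.norm(X)
    totals = [sum(residual(X, B) for B in subspaces)
              for subspaces in bases_per_class]
    return int(np.argmin(totals))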
6. Experiments

We implemented the proposed method and the method of Savas et al. in MATLAB and evaluated them on the MNIST handwritten digit database (Figure 5) and the IAM Face Database (Figure 7). The experiments were run on an Intel(R) Xeon(R) CPU E5420 (2.5 GHz) with 8 GB RAM under Windows Vista Business. The correction weight β was set to 2.

Table 1  Error rates (%)
  (1) Savas et al.                        8.4  4.6  4.4
  (2) Savas et al. + first improvement    8.0  4.5  4.4
  (3) Savas et al. + second improvement   8.2  4.4  4.2
  (4) Proposed (both improvements)        8.1  4.4  4.2

Figure 5  MNIST
Figure 7  IAM Face Database

6.1 MNIST

The MNIST handwritten digit database (footnote 1) contains images of the handwritten digits 0-9, each of 28×28 pixels, split into 60,000 training images and 10,000 test images (Figure 5). The ten digits form ten classes, and training sets of 120 and 1,000 images were used. We compared four methods: (1) the method of Savas et al.; (2) and (3) the method of Savas et al. extended with only one of the two proposed improvements, namely the autocorrelation matrix correction or the combination of subspaces; and (4) the proposed method with both improvements. The number K of basis tensors was varied from 20 to 40. The results are shown in Table 1.

(footnote 1) THE MNIST DATABASE of handwritten digits: http://yann.lecun.com/exdb/mnist/

6.2 IAM Face Database

The IAM Face Database (footnote 2) is provided on the web by the FKI research group. The face images are 512×342 pixels (Figure 7); for the experiments they were reduced to regions of 96×86 pixels (Figure 8). On this dataset we compared (1) the method of Savas et al. and (2) the method of Savas et al. with the proposed extension.

(footnote 2) IAM Face Database: http://www.iam.unibe.ch/fki/databases/iamfaces-database
6.3 Discussion

On MNIST, the error rates of all methods depend on the number K of basis tensors. With K = 36 the proposed method achieved error rates of 8.1%, 4.4% and 4.2%, an improvement of about 0.1-0.2% over the method of Savas et al. On the IAM Face Database, the method of Savas et al. already achieved an error rate of 0%, leaving no room for further improvement.

7. Conclusion

We proposed a classification method for higher order data that improves the HOSVD-based method of Savas et al. in two ways: the orthogonal factors are computed from autocorrelation matrices, which are corrected using misclassified training images, and several subspaces per class are combined for classification. The methods were evaluated on the MNIST handwritten digit database and the IAM Face Database.

References
[1] L. De Lathauwer, B. De Moor and J. Vandewalle: A multilinear singular value decomposition, SIAM J. Matrix Anal. Appl., 21, pp. 1253-1278 (2000).
[2] R. Bro: PARAFAC. Tutorial and applications, Chemom. Intell. Lab. Syst., 38, pp. 149-171 (1997).
[3] B. Savas and L. Eldén: Handwritten digit classification using higher order singular value decomposition, Pattern Recognition, 40, 3, pp. 993-1003 (2007).
[4] H. Wang and N. Ahuja: Facial expression decomposition, Proc. Ninth IEEE International Conference on Computer Vision, 2, p. 958 (2003).
[5] M. A. O. Vasilescu and D. Terzopoulos: Multilinear subspace analysis of image ensembles, Proc. IEEE Conference on Computer Vision and Pattern Recognition (2003).
[6] M. A. O. Vasilescu and D. Terzopoulos: TensorTextures: Multilinear image-based rendering, ACM SIGGRAPH 2004, Computer Graphics Proceedings, Annual Conference Series (2004).
[7] M. S. Bartlett, J. R. Movellan and T. J. Sejnowski: Face recognition by independent component analysis, IEEE Transactions on Neural Networks, 13, pp. 1450-1464 (2002).
[8] M. A. O. Vasilescu and D. Terzopoulos: Multilinear independent components analysis, Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR '05), pp. 547-553 (2005).
[9] J.-T. Sun, H.-J. Zeng, H. Liu and Y. Lu: CubeSVD: A novel approach to personalized Web search, Proc. 14th International World Wide Web Conference (WWW), ACM Press, pp. 382-390 (2005).
[10] T. Kolda and B. Bader: The TOPHITS model for higher-order web link analysis, Workshop on Link Analysis, Counterterrorism and Security, SDM 2006 (2006).
[11] E. Oja: Subspace Methods of Pattern Recognition, Research Studies Press Ltd. (1983).
[12] N. Sun, M. Abe and Y. Nemoto: A handwritten character recognition system by using improved directional element feature and subspace method, IEICE Technical Report, Pattern Recognition and Understanding, pp. 33-40 (1994).