
IBM Research Tokyo, LAB-S7B, 1623-14 Shimo-Tsuruma, Yamato, Kanagawa 242-8502, Japan
E-mail: goodidea@jp.ibm.com
RCAST, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8904, Japan
E-mail: yairi@space.rcast.u-tokyo.ac.jp

Vol. 49, No. 7 (July 2010). © 2010 SICE

1. Introduction

Textbook machine learning, as represented by Bishop 2), largely rests on the assumption that samples are independently and identically distributed (i.i.d.). System identification 21), by contrast, must learn from input-output records whose samples are temporally dependent. Over the past ten years or so, two developments from the machine-learning community have nevertheless reshaped nonlinear system identification: the "kernelization" of linear methods, which spread rapidly from around 2000, and sparse estimation based on L1 regularization. This article surveys nonlinear system identification from that machine-learning viewpoint: Sec. 2 treats input-output models built on the Volterra series, Sec. 3 treats state-space models, and Sec. 4 discusses computational issues.

2. Input-output models: the Volterra series

The Volterra series is the classical nonparametric description of a nonlinear input-output system 31). Its kernels were traditionally estimated by cross-correlation techniques; Franz and Schölkopf 8), 9) later showed that polynomial kernel regression gives a unified modern treatment, which we follow here.

2.1 The Volterra series

Write the output of a causal nonlinear system as a functional f[ ] of the input history, y(t) = f[t, u( )]. Under suitable regularity conditions this functional admits the expansion

    y(t) = h_0 + \sum_{n=1}^{\infty} \frac{1}{n!} \int ds_1 \cdots \int ds_n \, h_n(s_1, \ldots, s_n) \, u(t - s_1) \cdots u(t - s_n),    (1)

where u(t) is the input signal. Equation (1) is the Volterra series, and the functions h_n are its kernels. Truncating at first order leaves y(t) = h_0 + \int ds \, h_1(s) u(t - s), ordinary linear convolution; the higher-order terms are what carry the nonlinearity. In practice one discretizes time and assumes a finite memory of length m, collecting the m most recent input samples into a vector u \in R^m, much as in other sequential learning settings 29).
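To fix ideas, here is a minimal sketch that simulates a second-order instance of the discrete, finite-memory model just described and generates the kind of data set D used in the next subsection. Everything in it (memory length, kernel values, noise level) is an illustrative assumption, not a value from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

m, N = 5, 200    # memory length and number of input-output pairs (both assumed)

# Hypothetical ground-truth kernels of a quadratic (order-2) Volterra system.
h0 = 0.3
h1 = rng.normal(size=m)                 # first-order kernel  {h^1_i}
h2 = rng.normal(size=(m, m))
h2 = 0.5 * (h2 + h2.T)                  # second-order kernel {h^2_{ij}}, symmetrized

def volterra2(u):
    """y = h0 + sum_i h1_i u_i + sum_{ij} h2_ij u_i u_j (series truncated at n = 2)."""
    return h0 + h1 @ u + u @ h2 @ u

# Slide a length-m window over a scalar input stream to form u^(t) in R^m.
stream = rng.normal(size=N + m)
U = np.stack([stream[t:t + m] for t in range(N)])
y = np.array([volterra2(u) for u in U]) + 0.01 * rng.normal(size=N)
# D = {(U[t], y[t])} is the data set from which the kernels are to be recovered.
```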

In discrete time, the Volterra series takes the form

    y = h_0 + \sum_{n=1}^{\infty} \sum_{i_1=1}^{m} \cdots \sum_{i_n=1}^{m} h^n_{i_1, \ldots, i_n} u_{i_1} \cdots u_{i_n},    (2)

where the coefficients h^n_{i_1, \ldots, i_n} are the discrete Volterra kernels and the time index t has been suppressed. The Volterra identification problem is then: given the data set

    D \equiv \{ (u^{(t)}, y^{(t)}) \mid u^{(t)} \in R^m, \ y^{(t)} \in R, \ t = 1, 2, \ldots, N \},

estimate the Volterra kernels h_0, \{h^1_{i_1}\}, \ldots, \{h^n_{i_1, \ldots, i_n}\}, \ldots.

2.2 Feature-space form of the Volterra series

For an input u \in R^m, define the order-n feature vector φ_n(u) as the collection of all degree-n monomials of u; the zeroth-order feature is the constant φ_0(u) \equiv 1, and the first two nontrivial ones are

    φ_1(u) \equiv [u_1, \ldots, u_m]^T \in R^m,
    φ_2(u) \equiv [u_1 u_1, u_1 u_2, \ldots, u_m u_m]^T \in R^{m^2},    (3)

so that in general φ_n maps R^m into R^{m^n}. The series (2) then becomes y = \sum_{n=0}^{\infty} η_n^T φ_n(u), with coefficient vectors η_0 = h_0 and

    η_1 \equiv [h^1_1, h^1_2, \ldots, h^1_m]^T \in R^m,    (4)
    η_2 \equiv [h^2_{1,1}, h^2_{2,1}, \ldots, h^2_{m,m}]^T \in R^{m^2}.    (5)

Stacking all orders,

    η \equiv [η_0, η_1^T, η_2^T, \ldots]^T,    (6)
    φ(u) \equiv [1, φ_1(u)^T, φ_2(u)^T, \ldots]^T,    (7)

the Volterra series is the linear-in-parameters model y = η^T φ(u), and identification from D amounts to estimating η.

2.3 Kernel representation

The feature vectors φ_n never need to be computed explicitly, because their inner products collapse:

    φ_n(u)^T φ_n(z) = \sum_{i_1=1}^{m} \cdots \sum_{i_n=1}^{m} (u_{i_1} z_{i_1}) \cdots (u_{i_n} z_{i_n}) = (u^T z)^n.

Introducing weights \{β_1, β_2, \ldots \in R\} and redefining the stacked feature vector as φ(u) \equiv [β_0 φ_0(u), β_1 φ_1(u)^T, \ldots, β_∞ φ_∞(u)^T]^T, the inner product of two such vectors defines the kernel function

    k(u, z) \equiv φ(u)^T φ(z) = \sum_{n=0}^{\infty} β_n^2 (u^T z)^n.    (8)

Familiar kernels correspond to particular weight sequences \{β_n^2\}: taking β_n^2 = \binom{p}{n} for n \le p and β_n^2 = 0 otherwise, for an integer p \ge 0, gives the polynomial kernel

    k(u, z) = (1 + u^T z)^p,    (9)

while β_n^2 = 1/n! gives the exponential kernel

    k(u, z) = \exp(u^T z).    (10)

Such kernels thus implicitly encode a complete, even infinite-order, Volterra expansion. Given the data D, evaluating k on all input pairs yields the N × N Gram matrix K with entries (K)_{ij} = k(u^{(i)}, u^{(j)}).

2.4 Kernel ridge regression

To fit the model y = η^T φ(u) to D with a two-norm penalty on the Volterra coefficients, minimize over η

    Ψ(η \mid λ) \equiv \frac{1}{N} \sum_{n=1}^{N} \left| y^{(n)} - η^T φ(u^{(n)}) \right|^2 + λ \|η\|^2.    (11)
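The identities of Sec. 2.3 are easy to confirm numerically. The sketch below (dimension, inputs, and truncation order are arbitrary choices) checks φ_2(u)^T φ_2(z) = (u^T z)^2 and the exponential-kernel series (10).

```python
from math import factorial

import numpy as np

rng = np.random.default_rng(1)
u, z = rng.normal(size=4), rng.normal(size=4)

# Degree-2 feature map phi_2: all m^2 monomials u_i u_j, as in Eq. (3).
phi2 = lambda x: np.outer(x, x).ravel()

# Inner product of the explicit degree-2 features equals (u^T z)^2.
assert np.isclose(phi2(u) @ phi2(z), (u @ z) ** 2)

# With beta_n^2 = 1/n!, the series (8) sums to exp(u^T z), Eq. (10).
series = sum((u @ z) ** n / factorial(n) for n in range(40))
assert np.isclose(series, np.exp(u @ z))
```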

Here λ > 0 controls the strength of the L2 (squared-norm) penalty; λ = 0 would reduce (11) to ordinary least squares, which is ill-posed when the feature dimension exceeds N. The key fact is that the minimizer of (11) can be expressed as a linear combination of the training features,

    η = Φ α,    (12)

with dual coefficients α \in R^N and the feature matrix Φ \equiv [φ(u^{(1)}), φ(u^{(2)}), \ldots, φ(u^{(N)})]. Substituting (12) into (11) turns Ψ into a function of α alone:

    Ψ(α \mid λ) \equiv \frac{1}{N} \| y_N - K α \|^2 + λ α^T K α,    (13)

where y_N \equiv [y^{(1)}, y^{(2)}, \ldots, y^{(N)}]^T. This estimator is known as kernel ridge regression. Since (13) is a quadratic function of α, its minimizer has the closed form

    α^* = [K + λ N I_N]^{-1} y_N,

where I_N is the N × N identity matrix. Through (12), the prediction for a new input u becomes

    y = φ(u)^T Φ α^* = k(u)^T [K + λ N I_N]^{-1} y_N,    (14)

where k(u) \equiv [k(u, u^{(1)}), k(u, u^{(2)}), \ldots, k(u, u^{(N)})]^T \in R^N. Both training and prediction therefore touch only the N × N Gram matrix K and kernel evaluations, never the (potentially infinite-dimensional) Volterra coefficients: by (12), the unknown η is replaced by just N numbers α. That this restriction loses nothing is guaranteed by the representer theorem 33).
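The whole procedure fits in a few lines. The following is a minimal sketch of Eqs. (13)-(14) applied to data from the quadratic toy system simulated earlier; the polynomial degree p = 2 and λ = 10^-3 are illustrative choices, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from an assumed quadratic system (same construction as earlier).
m, N = 5, 200
stream = rng.normal(size=N + m)
U = np.stack([stream[t:t + m] for t in range(N)])
h1, h2 = rng.normal(size=m), rng.normal(size=(m, m))
y = U @ h1 + np.einsum('ti,ij,tj->t', U, h2, U) + 0.01 * rng.normal(size=N)

# Polynomial kernel, Eq. (9); p = 2 suffices for a second-order system.
p, lam = 2, 1e-3
kernel = lambda A, B: (1.0 + A @ B.T) ** p

K = kernel(U, U)                                      # N x N Gram matrix
alpha = np.linalg.solve(K + lam * N * np.eye(N), y)   # alpha* = [K + lam N I]^{-1} y_N

def predict(u_new):
    """Prediction y = k(u)^T alpha*, Eq. (14)."""
    return (kernel(np.atleast_2d(u_new), U) @ alpha).item()

u_test = rng.normal(size=m)
print(predict(u_test), u_test @ h1 + u_test @ h2 @ u_test)  # should be close
```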

3. State-space models

The input-output models of Sec. 2 regress the output directly on a finite window of past inputs. State-space models instead introduce a latent state that summarizes the input history, and are naturally described as dynamic Bayesian networks (Dynamic Bayesian Networks: DBN) 23), the time-series generalization of the Bayesian networks (Bayesian Networks: BN) popularized in the 1990s. The basic nonlinear state-space model is

    x_{t+1} = f(x_t, u_t) + v_t,    (15)
    y_t = g(x_t) + w_t,    (16)

where x_t is the latent state, u_t and y_t are the observed input and output, and v_t, w_t are noise terms; Fig. 1(a) shows its DBN representation. Structure learning for DBNs is treated in 10), 11), 23), and non-stationary or time-varying DBNs in 30), 34).

Fig. 1: (a) a state-space model drawn as a dynamic Bayesian network; (b) its switching variant.

3.1 Multiple-model (switching) approaches

One way to handle nonlinearity is to switch among several linear models 24). The Switching Linear Dynamical System (SLDS), also called the switching Kalman filter (SKF), augments the DBN with a discrete switch variable s_t that selects the active linear regime, as in Fig. 1(b). Learning proceeds by the EM (Expectation-Maximization) algorithm, whose E-step must infer two kinds of latent variables: the discrete switch s_t (which regime is active) and the continuous state x_t. Exact joint inference of s_t and x_t is intractable, so the E-step is approximated, for instance by alternating between the two, decoding the switch sequence s_t with the Viterbi algorithm 25), or by variational methods 12). Nonparametric Bayesian extensions of the SLDS have also been proposed 4), 7).

3.2 Nonlinear-regression approaches

The other main line learns the functions f and g in (15), (16) directly by nonlinear regression. (1) Ghahramani and Roweis 13) model f with Radial Basis Function Networks (RBFN), fitting by EM with extended Kalman smoothing in the E-step and RBFN refitting in the M-step. (2) Kernel methods: the Kernel Kalman Filter (KKF) 27) carries filtering, smoothing, and learning into a feature space; Goethals et al. 14) identify Hammerstein systems by subspace methods with least-squares SVMs; and Kawahara et al. 17) derive a kernel subspace method from stochastic realization using kernel canonical correlation analysis (Kernel Canonical Correlation Analysis: KCCA). (3) Gaussian processes 28): Gaussian Process Dynamical Models (GPDM) 35) place GP priors directly on f and g, while GP-BayesFilters 18) and the GP-ADF 5) combine learned GP models of f and g with Bayesian filtering (see the sketch following this list). (4) Manifold learning (Manifold Learning; see 32)): when the state sequence {x_t} lies near a low-dimensional manifold, spectral embeddings such as the Laplacian Eigenmap (LE) 1) can recover the latent space, an idea applied to people tracking in 22).
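As a minimal illustration of approach (3), the sketch below fits a GP regressor (squared-exponential kernel; the toy system, length scale, and noise level are all assumptions) to one-step transition pairs (x_t, x_{t+1}) of a scalar system and takes the posterior mean as an estimate of f. It shows only the regression core: methods like GPDM additionally infer the latent states, which are treated as observed here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a scalar nonlinear system x_{t+1} = f(x_t) + v_t (assumed form).
f_true = lambda x: 0.8 * x + 2.0 * np.sin(x)
T = 150
x = np.empty(T)
x[0] = 0.5
for t in range(T - 1):
    x[t + 1] = f_true(x[t]) + 0.05 * rng.normal()

X, Y = x[:-1], x[1:]                     # one-step transition pairs (x_t, x_{t+1})

# Squared-exponential kernel; length scale and noise variance are illustrative.
ell, sig2 = 1.0, 0.05 ** 2
k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# GP posterior mean of f at query points q: m(q) = k(q, X) [K + sig2 I]^{-1} Y.
K = k(X, X) + sig2 * np.eye(T - 1)
beta = np.linalg.solve(K, Y)
f_hat = lambda q: k(np.atleast_1d(q), X) @ beta

q = np.array([0.0, 1.0])
print(f_hat(q), f_true(q))               # posterior mean vs. true dynamics
```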

4. Computational issues

4.1 Scaling to large data sets

The kernel formulation trades the unknown Volterra coefficients for the N × N Gram matrix, so training as in (14) costs O(N^3) time and O(N^2) memory, which is prohibitive for long records; see 16) for a survey of large-scale kernel methods. De Moor and coworkers address the problem through fixed-size LS-SVM formulations 3), 6). Another standard remedy is the Nyström method 37), which approximates K from a small subset of its columns; its behavior in Gaussian-process prediction is examined in 36).

4.2 Model selection and sparsity

Choosing model complexity, such as the memory length m of the input vector u or the order M of an ARX-type model, is the classical model-selection problem 15); Pelckmans et al. 26) study complexity control in the identification of Hammerstein systems. An alternative to the squared (L2) penalty in (11) is the L1 penalty, which yields the Lasso (least absolute shrinkage and selection operator). Kukreja et al. 19) apply the Lasso to nonlinear system identification: as λ grows, individual coefficients η_i are driven exactly to zero, so the estimator fits y and selects the relevant Volterra terms at the same time (a small sketch follows at the end of this section). Note, however, that the representer theorem 33) no longer applies under the L1 penalty, so the convenient dual form of Sec. 2.4 is lost.

4.3 Support vector regression

Support vector machines (SVM) in their regression form, SVR (support vector regression), replace the squared loss with an ε-insensitive loss, which yields sparse dual solutions and applies to system identification in the same feature-space setting as Sec. 2. Langford et al. 20) study the learning of nonlinear dynamic models from a related machine-learning perspective.
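In the spirit of 19), the following sketch performs L1-regularized Volterra identification with explicit first- and second-order features (m is small enough here to enumerate them), using scikit-learn's Lasso; the data generator, m, and the regularization weight alpha are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)

# Toy data: only two Volterra terms are truly active (sparse ground truth).
m, N = 5, 300
stream = rng.normal(size=N + m)
U = np.stack([stream[t:t + m] for t in range(N)])
y = 1.5 * U[:, 0] - 0.7 * U[:, 2] * U[:, 3] + 0.01 * rng.normal(size=N)

# Explicit features: phi_1 (m columns) plus the upper-triangular
# second-order monomials u_i u_j with i <= j (m(m+1)/2 columns).
Phi = np.hstack([U] + [U[:, [i]] * U[:, i:] for i in range(m)])

model = Lasso(alpha=0.01).fit(Phi, y)
active = np.flatnonzero(np.abs(model.coef_) > 1e-6)
# Ideally, `active` contains just the columns for u_0 and u_2*u_3.
print("active terms:", active, "out of", Phi.shape[1])
```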

5. Concluding remarks

This article has surveyed nonlinear system identification through the lens of machine learning as it has developed over the past decade 2): kernelization, which lets classical Volterra and state-space models be estimated implicitly through Gram matrices, and L1-regularized sparse estimation, which makes model selection part of the fitting itself.

References

1) M. Belkin and P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Advances in Neural Information Processing Systems 14, pages 585-591. MIT Press, 2001.
2) C. M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
3) K. De Brabanter, P. Dreesen, P. Karsmakers, K. Pelckmans, J. De Brabanter, J. Suykens, and B. De Moor. Fixed-size LS-SVM applied to the Wiener-Hammerstein benchmark. In Proceedings of the 15th IFAC Symposium on System Identification, 2009.
4) S. Chiappa, J. Kober, and J. Peters. Using Bayesian dynamical systems for motion template libraries. In Advances in Neural Information Processing Systems 21. MIT Press, 2009.
5) M. P. Deisenroth, M. F. Huber, and U. D. Hanebeck. Analytic moment-based Gaussian process filtering. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 225-232, 2009.
6) T. Falck, K. Pelckmans, J. Suykens, and B. De Moor. Identification of Wiener-Hammerstein systems using LS-SVMs. In Proceedings of the 15th IFAC Symposium on System Identification, pages 820-825, 2009.
7) E. Fox, E. Sudderth, M. Jordan, and A. Willsky. Nonparametric Bayesian learning of switching linear dynamical systems. In Advances in Neural Information Processing Systems 21. MIT Press, 2009.
8) M. O. Franz and B. Schölkopf. Implicit Wiener series for higher-order image analysis. In Advances in Neural Information Processing Systems 17, pages 465-472. MIT Press, 2005.
9) M. O. Franz and B. Schölkopf. A unifying view of Wiener and Volterra theory and polynomial kernel regression. Neural Computation, 18(12):3097-3118, 2006.
10) N. Friedman, K. Murphy, and S. Russell. Learning the structure of dynamic probabilistic networks. In Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence (UAI-98), pages 139-147, 1998.
11) Z. Ghahramani. Learning dynamic Bayesian networks. In Adaptive Processing of Sequences and Data Structures (Lecture Notes in Artificial Intelligence), pages 168-197. Springer-Verlag, 1997.
12) Z. Ghahramani and G. E. Hinton. Variational learning for switching state-space models. Neural Computation, 12:963-996, 1998.
13) Z. Ghahramani and S. Roweis. Learning nonlinear dynamical systems using an EM algorithm. In Advances in Neural Information Processing Systems 11, pages 431-437. MIT Press, 1999.
14) I. Goethals, K. Pelckmans, J. A. K. Suykens, and B. De Moor. Subspace identification of Hammerstein systems using least squares support vector machines. IEEE Transactions on Automatic Control, 50:1509-1519, 2005.
15) T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, 2001.
16) H. Kashima, T. Idé, T. Kato, and M. Sugiyama. Recent advances and trends in large-scale kernel methods. IEICE Transactions on Information and Systems, E92-D(7):1338-1353, 2009.
17) Y. Kawahara, T. Yairi, and K. Machida. A kernel subspace method by stochastic realization for learning nonlinear dynamical systems. In Advances in Neural Information Processing Systems 19, pages 665-672. MIT Press, 2007.
18) J. Ko and D. Fox. GP-BayesFilters: Bayesian filtering using Gaussian process prediction and observation models. Autonomous Robots, 27(1):75-90, 2009.
19) S. L. Kukreja, J. Löfberg, and M. J. Brenner. A least absolute shrinkage and selection operator (LASSO) for nonlinear system identification. In Proceedings of the 14th IFAC Symposium on System Identification, 2006.
20) J. Langford, R. Salakhutdinov, and T. Zhang. Learning nonlinear dynamic models. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 593-600, 2009.
21) L. Ljung. Perspectives on system identification. In Proceedings of the IFAC World Congress, 2008.
22) Z. Lu, M. Carreira-Perpiñán, and C. Sminchisescu. People tracking with the Laplacian eigenmaps latent variable model. In Advances in Neural Information Processing Systems 20, pages 1705-1712. MIT Press, 2008.
23) K. P. Murphy. Dynamic Bayesian Networks: Representation, Inference and Learning. PhD thesis, UC Berkeley, 2002.
24) R. Murray-Smith and T. A. Johansen. Multiple Model Approaches to Modelling and Control. CRC Press, 1997.
25) V. Pavlovic, J. M. Rehg, and J. MacCormick. Learning switching linear models of human motion. In Advances in Neural Information Processing Systems 13, pages 981-987. MIT Press, 2001.
26) K. Pelckmans, I. Goethals, J. Suykens, and B. De Moor. On model complexity control in identification of Hammerstein systems. In Proceedings of the IEEE Conference on Decision and Control, volume 2, pages 1203-1208, 2005.
27) L. Ralaivola and F. d'Alché-Buc. Time series filtering, smoothing and learning using the kernel Kalman filter. In Proceedings of the IEEE International Joint Conference on Neural Networks, pages 1449-1454, 2005.
28) C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.
29) R. S. Sutton and A. G. Barto. Reinforcement Learning: An Introduction. MIT Press, 1998.
30) J. W. Robinson and A. J. Hartemink. Non-stationary dynamic Bayesian networks. In Advances in Neural Information Processing Systems 21, pages 1369-1376. MIT Press, 2009.
31) W. J. Rugh. Nonlinear System Theory: The Volterra/Wiener Approach. The Johns Hopkins University Press, 1981 (Web version, 2002).
32) L. K. Saul, K. Q. Weinberger, J. H. Ham, F. Sha, and D. D. Lee. Spectral methods for dimensionality reduction. Chapter 16 of Semi-Supervised Learning, pages 279-294. MIT Press, 2006.
33) B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.
34) L. Song, M. Kolar, and E. Xing. Time-varying dynamic Bayesian networks. In Advances in Neural Information Processing Systems 22, pages 1732-1740. MIT Press, 2009.
35) J. Wang, D. Fleet, and A. Hertzmann. Gaussian process dynamical models. In Advances in Neural Information Processing Systems 18, pages 1441-1448. MIT Press, 2006.
36) C. K. I. Williams, C. E. Rasmussen, A. Schwaighofer, and V. Tresp. Observations on the Nyström method for Gaussian process prediction. Technical report, 2002. http://www.dai.ed.ac.uk/homes/ckiw/online_pubs.html
37) C. K. I. Williams and M. Seeger. Using the Nyström method to speed up kernel machines. In Advances in Neural Information Processing Systems 13, pages 682-688, 2001.
On model complexity control in identification of Hammerstein systems. In Proceedings of IEEE Conference on Decision and Control, volume 2, pages 1203 1208, 2005. 27 L. Ralaivola and F. d Alche Buc. Time series filtering, smoothing and learning using the kernel Kalman filter. In Proc. of IEEE Int. Joint Conference on Neural Networks, pages 1449 1454, 2005. 28 C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006. 29 A. G. B. Richard S. Sutton. Reinforcement Learning: An Introduction. The MIT Press, 1998. 30 J. W. Robinson and A. J. Hartemink. Non-stationary dynamic Bayesian networks. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1369 1376. MIT Press, 2009. 31 W. J. Rugh. Nonlinear System Theory The Volterra/Wiener Approach. The Johns Hopkins University Press, 1981. (Web version prepared in 2002.). 32 L. K. Saul, K. Q. Weinberger, J. H. Ham, F. Sha, and D. Lee. Chapter 16: Spectral methods for dimensionality reduction. In Semisupervised learning, pages 279 294. MIT Press, 2006. 33 B. Schölkopf and A. J. Smola. Learning with Kernels. The MIT Press, 2002. 34 L. Song, M. Kolar, and E. Xing. Time-varying dynamic Bayesian networks. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, editors, Advances in Neural Information Processing Systems 22, pages 1732 1740. MIT Press, 2009. 35 J. Wang, D. Fleet, and A. Hertzmann. Gaussian process dynamical models. In Advances in Neural Information Processing Systems 18, pages 1441 1448. MIT Press, 2006. 36 C. K. I. Williams, C. E. Rasmussen, A. Schwaighofer, and V. Tresp. Observations on the Nyström method for Gaussian process prediction, http://www.dai.ed.ac.uk/homes/ckiw/online pubs.html. 2002. 37 C. K. I. Williams and M. Seeger. Using the Nyström method to speed up kernel machines. In Advances in Neural Information Processing Systems 13, pages 682 688, 2001. 2000 IBM ACM SIGKDD 1999 2006 6 49 7 2010 7