
Transcript:

Consider a set of numbers, for example X = [ 1 2 4 6 12 15 25 45 68 67 65 98 ]. In code, X can be declared as an array of doubles, and the number of elements in X is given by X.Length:

double[] X = { 1, 2, 4, 6, 12, 15, 25, 45, 68, 67, 65, 98 };
int n = X.Length;

double[] x1 = { 0, 8, 12, 20 };
double[] x2 = { 8, 9, 11, 12 };

double mean1 = x1.Mean();
double mean2 = x2.Mean();

// The standard deviation can be computed directly...
double stddev1 = x1.StandardDeviation();
double stddev2 = x2.StandardDeviation();

// ... or, to avoid recomputing, by passing an already computed mean:
stddev1 = x1.StandardDeviation(mean1);
stddev2 = x2.StandardDeviation(mean2);

// Create some sets of numbers
double[] x1 = { 0, 8, 12, 20 };
double[] x2 = { 8, 9, 11, 12 };

// Compute the means
double mean1 = x1.Mean();
double mean2 = x2.Mean();

// Compute the standard deviations
double stddev1 = x1.StandardDeviation(mean1);
double stddev2 = x2.StandardDeviation(mean2);

// Show results on screen
Console.WriteLine("Data:");
Console.WriteLine(" x1: " + x1.ToString("g"));
Console.WriteLine(" x2: " + x2.ToString("g"));
Console.WriteLine("Means:");
Console.WriteLine(" x1: " + mean1);
Console.WriteLine(" x2: " + mean2);
Console.WriteLine("Standard Deviations:");
Console.WriteLine(" x1: " + stddev1);
Console.WriteLine(" x2: " + stddev2);

Output:

Data:
 x1: 0 8 12 20
 x2: 8 9 11 12
Means:
 x1: 10
 x2: 10
Standard Deviations:
 x1: 8.32666399786453
 x2: 1.82574185835055
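The numbers above can be checked independently of any library; a minimal Python sketch of the sample (n − 1) formulas the example uses:

```python
import math

def mean(xs):
    # Arithmetic mean of a list of numbers
    return sum(xs) / len(xs)

def sample_std(xs):
    # Sample standard deviation: divide by (n - 1), not n
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

x1 = [0, 8, 12, 20]
x2 = [8, 9, 11, 12]

m1, m2 = mean(x1), mean(x2)          # both 10.0
s1, s2 = sample_std(x1), sample_std(x2)  # ~8.3267 and ~1.8257
```

Note that x1 and x2 share the same mean but have very different spreads, which is exactly what the standard deviation captures.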

Besides StandardDeviation(), there is also a Variance() method. The covariance between two vectors can be computed in the same fashion:

double cov = x1.Covariance(x2);

double[,] data =
{
    // Hours (H)  Mark (M)
    {  9, 39 },
    { 15, 56 },
    { 25, 93 },
    { 14, 61 },
    { 10, 50 },
    { 18, 75 },
    {  0, 32 },
    { 16, 85 },
    {  5, 42 },
    { 19, 70 },
    { 16, 66 },
    { 20, 80 }
};

double[,] covarianceMatrix = data.Covariance();
ScatterplotBox.Show(data);

double[,] data =
{
    // Hours (H)  Mark (M)
    {  9, 39 },
    { 15, 56 },
    { 25, 93 },
    { 14, 61 },
    { 10, 50 },
    { 18, 75 },
    {  0, 32 },
    { 16, 85 },
    {  5, 42 },
    { 19, 70 },
    { 16, 66 },
    { 20, 80 }
};

// Compute total and average
double[] totals = data.Sum();
double[] averages = data.Mean();

// Compute covariance matrix
double[,] C = data.Covariance();

// Show results on screen
Console.WriteLine("Data: ");
Console.WriteLine(" Hours(H) Mark(M)");
Console.WriteLine(data.ToString(" 00"));
Console.WriteLine("Sum: " + totals.ToString("000.00"));
Console.WriteLine("Avg: " + averages.ToString(" 00.00"));
Console.WriteLine("Covariance matrix:");
Console.WriteLine(C.ToString(" 000.00"));
Console.ReadKey();

ScatterplotBox.Show(data);

Output:

Data:
 Hours(H) Mark(M)
 09 39
 15 56
 25 93
 14 61
 10 50
 18 75
 00 32
 16 85
 05 42
 19 70
 16 66
 20 80
Sum: 167.00 749.00
Avg: 13.92 62.42
Covariance matrix:
 047.72 122.95
 122.95 370.08
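The covariance matrix printed above can be reproduced from the definition alone; a minimal Python sketch (not part of the original example) using the sample covariance formula with the (n − 1) divisor:

```python
def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    # Sample covariance between two equal-length lists
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

hours = [9, 15, 25, 14, 10, 18, 0, 16, 5, 19, 16, 20]
marks = [39, 56, 93, 61, 50, 75, 32, 85, 42, 70, 66, 80]

# The 2x2 covariance matrix: variances on the diagonal,
# the (symmetric) covariance off the diagonal
C = [[cov(hours, hours), cov(hours, marks)],
     [cov(marks, hours), cov(marks, marks)]]
```

Rounded to two decimals, C matches the matrix shown above: 47.72 and 370.08 on the diagonal, 122.95 off-diagonal.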

// Consider the following matrix
double[,] A =
{
    { 2, 3 },
    { 2, 1 }
};

// Now consider the vector
double[] u = { 1, 3 };

// Multiplying both, we get x = [ 11 5 ]'
double[] x = A.Multiply(u);

// We can not express 'x' as a multiple
// of 'u', so 'u' is not an eigenvector.

// However, consider now the vector
double[] v = { 3, 2 };

// Multiplying both, we get y = [ 12 8 ]'
double[] y = A.Multiply(v);

// It can be seen that 'y' can be expressed as
// a multiple of 'v'. Since y = 4*v, 'v' is an
// eigenvector with the associated eigenvalue 4.

// Show on screen
Console.WriteLine("Matrix A:");
Console.WriteLine(A.ToString(" 0"));
Console.WriteLine("Vector u:");
Console.WriteLine(u.Transpose().ToString(" 0"));
Console.WriteLine("Vector v:");
Console.WriteLine(v.Transpose().ToString(" 0"));
Console.WriteLine("x = A*u");
Console.WriteLine(x.Transpose().ToString(" 0"));
Console.WriteLine("y = A*v");
Console.WriteLine(y.Transpose().ToString(" 0"));

Output:

Matrix A:
 2 3
 2 1
Vector u:
 1 3
Vector v:
 3 2
x = A*u
 11 5
y = A*v
 12 8
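The same check can be done by hand with a plain matrix-vector product; a small Python sketch (not part of the original example):

```python
def matvec(A, v):
    # Multiply a matrix (list of rows) by a vector
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 3],
     [2, 1]]
u = [1, 3]
v = [3, 2]

x = matvec(A, u)   # [11, 5]: not a scalar multiple of u
y = matvec(A, v)   # [12, 8]: equals 4*v, so v is an eigenvector of A
```

Since y = 4·v componentwise, v is an eigenvector of A with eigenvalue 4, while no single scalar relates x to u.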

More formally: let A be an n×n matrix and v a nonzero n-dimensional vector such that

    A v = λ v.

Then the scalar λ is called an eigenvalue of A, and v an eigenvector associated with λ. Gathering the eigenvectors of A as the columns of a matrix V, and the corresponding eigenvalues on the diagonal of a matrix Λ, the matrix A can (when V is invertible) be decomposed as

    A = V Λ V⁻¹.

As an example, consider the matrix

        ( 3 2 4 )
    M = ( 2 0 2 )
        ( 4 2 3 )

Multiplying M by the vector (1, 0, −1)ᵀ gives

      (  1 )   ( 3 2 4 ) (  1 )   ( −1 )
    M (  0 ) = ( 2 0 2 ) (  0 ) = (  0 )
      ( −1 )   ( 4 2 3 ) ( −1 )   (  1 )

that is, M v = −1 · v for v = (1, 0, −1)ᵀ. The pair (λ, v) with λ = −1 and v = (1, 0, −1)ᵀ is therefore an eigenpair of M: the eigenvalue −1 appears on the diagonal of Λ, and v becomes a column of V.

// Consider the following matrix
double[,] M =
{
    { 3, 2, 4 },
    { 2, 0, 2 },
    { 4, 2, 3 }
};

// Create an Eigenvalue decomposition
var evd = new EigenvalueDecomposition(M);

// Store the eigenvalues and eigenvectors
double[] λ = evd.RealEigenvalues;
double[,] V = evd.Eigenvectors;

// Reconstruct M = V*λ*V'
double[,] R = V.MultiplyByDiagonal(λ).MultiplyByTranspose(V);

// Show on screen
Console.WriteLine("Matrix: ");
Console.WriteLine(M.ToString(" 0"));
Console.WriteLine("Eigenvalues: ");
Console.WriteLine(λ.ToString(" +0.000; -0.000;"));
Console.WriteLine("Eigenvectors:");
Console.WriteLine(V.ToString(" +0.000; -0.000;"));
Console.WriteLine("Reconstruction:");
Console.WriteLine(R.ToString(" 0"));

Output:

Matrix:
 3 2 4
 2 0 2
 4 2 3
Eigenvalues:
 -1.000 -1.000 +8.000
Eigenvectors:
 -0.041 -0.744 +0.667
 +0.910 +0.248 +0.333
 -0.413 +0.620 +0.667
Reconstruction:
 3 2 4
 2 0 2
 4 2 3
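Two of the eigenpairs of M can be verified with nothing more than a matrix-vector product; a minimal Python sketch (not part of the original example):

```python
def matvec(A, v):
    # Multiply a matrix (list of rows) by a vector
    return [sum(a * x for a, x in zip(row, v)) for row in A]

M = [[3, 2, 4],
     [2, 0, 2],
     [4, 2, 3]]

# Eigenvalue 8 with eigenvector (2, 1, 2); normalized, (2,1,2)/3
# matches the (+0.667, +0.333, +0.667) column printed above
y8 = matvec(M, [2, 1, 2])    # [16, 8, 16] = 8 * (2, 1, 2)

# Eigenvalue -1 with eigenvector (1, 0, -1), as derived earlier
ym1 = matvec(M, [1, 0, -1])  # [-1, 0, 1] = -1 * (1, 0, -1)
```

The trace also checks out: 3 + 0 + 3 = 6 = (−1) + (−1) + 8.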

double[,] data =
{
    { 2.5, 2.4 },
    { 0.5, 0.7 },
    { 2.2, 2.9 },
    { 1.9, 2.2 },
    { 3.1, 3.0 },
    { 2.3, 2.7 },
    { 2.0, 1.6 },
    { 1.0, 1.1 },
    { 1.5, 1.6 },
    { 1.1, 0.9 }
};

double[] mean = data.Mean();
double[,] dataAdjust = data.Subtract(mean);

Data =
  x    y
 2.5  2.4
 0.5  0.7
 2.2  2.9
 1.9  2.2
 3.1  3.0
 2.3  2.7
 2.0  1.6
 1.0  1.1
 1.5  1.6
 1.1  0.9

dataAdjust =
   x      y
  0.69   0.49
 -1.31  -1.21
  0.39   0.99
  0.09   0.29
  1.29   1.09
  0.49   0.79
  0.19  -0.31
 -0.81  -0.81
 -0.31  -0.31
 -0.71  -1.01

double[,] cov = dataAdjust.Covariance();

cov = ( 0.6165555556  0.6154444444 )
      ( 0.6154444444  0.7165555556 )

var evd = new EigenvalueDecomposition(cov);
double[] eigenvalues = evd.RealEigenvalues;
double[,] eigenvectors = evd.Eigenvectors;

// Sort eigenvalues and vectors in descending order
eigenvectors = Matrix.Sort(eigenvalues, eigenvectors,
    new GeneralComparer(ComparerDirection.Descending, true));

eigenvalues = ( 1.2840277122  0.0490833989 )

eigenvectors = ( 0.6778733985  -0.7351786555 )
               ( 0.7351786555   0.6778733985 )

// Either keep all eigenvectors as the feature vector ...
double[,] featureVector = eigenvectors;

// ... or keep only the first principal component:
// double[,] featureVector = eigenvectors.GetColumn(0).Transpose();

double[,] finalData = dataAdjust.Multiply(featureVector);

Using both components, the transformed data is:

     1st PC          2nd PC
  0.8279701862   -0.1751153070
 -1.7775803253    0.1428572265
  0.9921974944    0.3843749889
  0.2742104160    0.1304172066
  1.6758014186   -0.2094984613
  0.9129491032    0.1752824436
 -0.0991094375   -0.3498246981
 -1.1445721638    0.0464172582
 -0.4380461368    0.0177646297
 -1.2238205551   -0.1626752871

Keeping only the first component instead gives:

     1st PC
  0.8279701862
 -1.7775803253
  0.9921974944
  0.2742104160
  1.6758014186
  0.9129491032
 -0.0991094375
 -1.1445721638
 -0.4380461368
 -1.2238205551

// Step 1. Get some data
double[,] data =
{
    { 2.5, 2.4 },
    { 0.5, 0.7 },
    { 2.2, 2.9 },
    { 1.9, 2.2 },
    { 3.1, 3.0 },
    { 2.3, 2.7 },
    { 2.0, 1.6 },
    { 1.0, 1.1 },
    { 1.5, 1.6 },
    { 1.1, 0.9 }
};

// Step 2. Subtract the mean
double[] mean = data.Mean();
double[,] dataAdjust = data.Subtract(mean);

// Step 3. Calculate the covariance matrix
double[,] cov = dataAdjust.Covariance();

// Step 4. Calculate the eigenvectors and
// eigenvalues of the covariance matrix
var evd = new EigenvalueDecomposition(cov);
double[] eigenvalues = evd.RealEigenvalues;
double[,] eigenvectors = evd.Eigenvectors;

// Step 5. Choosing components and forming a feature vector:
// sort eigenvalues and vectors in descending order
eigenvectors = Matrix.Sort(eigenvalues, eigenvectors,
    new GeneralComparer(ComparerDirection.Descending, true));

// Select all eigenvectors
double[,] featureVector = eigenvectors;

// Step 6. Deriving the new data set
double[,] finalData = dataAdjust.Multiply(featureVector);

Output:

Data
 x y
 ------------
 2.5 2.4
 0.5 0.7
 2.2 2.9
 1.9 2.2
 3.1 3.0
 2.3 2.7
 2.0 1.6
 1.0 1.1
 1.5 1.6
 1.1 0.9

Data Adjust
 x y
 ------------
 0.69 0.49
 -1.31 -1.21
 0.39 0.99
 0.09 0.29
 1.29 1.09
 0.49 0.79
 0.19 -0.31
 -0.81 -0.81
 -0.31 -0.31
 -0.71 -1.01

Covariance Matrix:
 +0.6165555556 +0.6154444444
 +0.6154444444 +0.7165555556

Eigenvalues:
 +1.2840277122 +0.0490833989

Eigenvectors:
 +0.6778733985 -0.7351786555
 +0.7351786555 +0.6778733985

Transformed Data
 x y
 ----------------------------
 0.8279701862 -0.1751153070
 -1.7775803253 0.1428572265
 0.9921974944 0.3843749889
 0.2742104160 0.1304172066
 1.6758014186 -0.2094984613
 0.9129491032 0.1752824436
 -0.0991094375 -0.3498246981
 -1.1445721638 0.0464172582
 -0.4380461368 0.0177646297
 -1.2238205551 -0.1626752871
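For a 2×2 covariance matrix the whole pipeline can be written out by hand, since the eigenvalues of a symmetric 2×2 matrix have a closed form. A minimal Python sketch (not part of the original example) reproducing steps 2 through 6:

```python
import math

data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0),
        (2.3, 2.7), (2.0, 1.6), (1.0, 1.1), (1.5, 1.6), (1.1, 0.9)]

# Step 2: subtract the mean
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
adj = [(x - mx, y - my) for x, y in data]

# Step 3: sample covariance matrix [[a, b], [b, c]]
a = sum(dx * dx for dx, _ in adj) / (n - 1)
b = sum(dx * dy for dx, dy in adj) / (n - 1)
c = sum(dy * dy for _, dy in adj) / (n - 1)

# Step 4: closed-form eigenvalues of a symmetric 2x2 matrix
disc = math.sqrt((a - c) ** 2 + 4 * b * b)
l1 = (a + c + disc) / 2   # largest eigenvalue, ~1.2840
l2 = (a + c - disc) / 2   # smallest eigenvalue, ~0.0491

# Step 5: eigenvector for l1 ((A - l1*I)v = 0 is solved by (b, l1 - a))
v1 = (b, l1 - a)
norm = math.hypot(*v1)
v1 = (v1[0] / norm, v1[1] / norm)   # ~(0.6779, 0.7352)

# Step 6: project the adjusted data onto the first component
first_pc = [dx * v1[0] + dy * v1[1] for dx, dy in adj]
```

The resulting first_pc column matches the first column of the "Transformed Data" table above.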

// Show on screen
Console.WriteLine("Data");
Console.WriteLine(" x y");
Console.WriteLine(" ------------");
Console.WriteLine(data.ToString(" 0.0 "));
Console.ReadKey();

Console.WriteLine("Data Adjust");
Console.WriteLine(" x y");
Console.WriteLine(" ------------");
Console.WriteLine(dataAdjust.ToString(" 0.00; -0.00;"));
Console.ReadKey();

ScatterplotBox.Show("Original PCA data", data);
Console.ReadKey();

Console.WriteLine("Covariance Matrix: ");
Console.WriteLine(cov.ToString(" +0.0000000000; -0.0000000000;"));
Console.WriteLine("Eigenvalues: ");
Console.WriteLine(eigenvalues.ToString(" +0.0000000000; -0.0000000000;"));
Console.WriteLine("Eigenvectors:");
Console.WriteLine(eigenvectors.ToString(" +0.0000000000; -0.0000000000;"));
Console.ReadKey();

Console.WriteLine("Transformed Data");
Console.WriteLine(" x y");
Console.WriteLine(" ----------------------------");
Console.WriteLine(finalData.ToString(" 0.0000000000; -0.0000000000;"));

ScatterplotBox.Show("Transformed PCA data", finalData);

The singular value decomposition (SVD) factorizes an m×n matrix A as

    A = U Σ Vᵀ

where U and V are orthogonal matrices and Σ is a diagonal matrix of singular values. The SVD is closely related to PCA: the covariance matrix of a mean-centered data matrix A can be written as

    C = 1/(n−1) · AᵀA.

Since AᵀA is symmetric, it admits an eigendecomposition AᵀA = V Λ Vᵀ, and setting Λ = ΣᵀΣ gives

    AᵀA = V Λ Vᵀ = V (ΣᵀΣ) Vᵀ.

Because U is orthogonal, I = UᵀU, so we can insert it between Σᵀ and Σ:

    AᵀA = V Λ Vᵀ = V ΣᵀΣ Vᵀ = V Σᵀ I Σ Vᵀ = V Σᵀ Uᵀ U Σ Vᵀ = (V Σᵀ Uᵀ)(U Σ Vᵀ)

which is exactly AᵀA for A = U Σ Vᵀ and Aᵀ = V Σᵀ Uᵀ. Likewise,

    (U Σ Vᵀ)(V Σᵀ Uᵀ) = AAᵀ.

In other words, the columns of V are eigenvectors of AᵀA, the columns of U are eigenvectors of AAᵀ, and the entries of Σ are the square roots of the eigenvalues of AᵀA (and AAᵀ). To see this in practice, consider again the same data matrix:

double[,] data =
{
    { 2.5, 2.4 },
    { 0.5, 0.7 },
    { 2.2, 2.9 },
    { 1.9, 2.2 },
    { 3.1, 3.0 },
    { 2.3, 2.7 },
    { 2.0, 1.6 },
    { 1.0, 1.1 },
    { 1.5, 1.6 },
    { 1.1, 0.9 }
};

As before, subtract the mean, but this time decompose the adjusted data matrix directly instead of forming a covariance matrix:

double[] mean = data.Mean();
double[,] dataAdjust = data.Subtract(mean);

var svd = new SingularValueDecomposition(dataAdjust);
double[] singularValues = svd.Diagonal;
double[,] eigenvectors = svd.RightSingularVectors;

singularValues = ( 3.3994483978  0.6646432054 )

eigenvectors = ( 0.6778733985  -0.7351786555 )
               ( 0.7351786555   0.6778733985 )

The eigenvalues of AᵀA are the squares of the singular values of A:

double[] eigenvalues = singularValues.ElementwisePower(2);

Since the covariance matrix carried an extra 1/(n−1) factor that the plain product AᵀA does not, divide by n − 1 to recover the covariance eigenvalues:

eigenvalues = eigenvalues.Divide(data.GetLength(0) - 1);

eigenvalues = ( 1.2840277122  0.0490833989 )
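This relationship is a one-liner to verify; a small Python sketch (not part of the original example) using the singular values printed above:

```python
# Singular values of the mean-centered 10x2 data matrix, as printed above
singular_values = [3.3994483978, 0.6646432054]
n = 10  # number of data points

# Square each singular value and divide by (n - 1) to recover
# the eigenvalues of the sample covariance matrix
eigenvalues = [(s * s) / (n - 1) for s in singular_values]
```

The result matches the eigenvalues obtained earlier from the covariance matrix (about 1.2840 and 0.0491), confirming the SVD route.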

// Step 1. Get some data
double[,] data =
{
    { 2.5, 2.4 },
    { 0.5, 0.7 },
    { 2.2, 2.9 },
    { 1.9, 2.2 },
    { 3.1, 3.0 },
    { 2.3, 2.7 },
    { 2.0, 1.6 },
    { 1.0, 1.1 },
    { 1.5, 1.6 },
    { 1.1, 0.9 }
};

// Step 2. Subtract the mean
double[] mean = data.Mean();
double[,] dataAdjust = data.Subtract(mean);

// Step 3. Calculate the singular values and
// singular vectors of the data matrix
var svd = new SingularValueDecomposition(dataAdjust);
double[] singularValues = svd.Diagonal;
double[,] eigenvectors = svd.RightSingularVectors;

// Step 4. Calculate the eigenvalues as
// the square of the singular values
double[] eigenvalues = singularValues.ElementwisePower(2);

// Step 5. Choosing components and forming a
// feature vector: select all eigenvectors
double[,] featureVector = eigenvectors;

// Step 6. Deriving the new data set
double[,] finalData = dataAdjust.Multiply(featureVector);

Output:

Data
 x y
 ------------
 2.5 2.4
 0.5 0.7
 2.2 2.9
 1.9 2.2
 3.1 3.0
 2.3 2.7
 2.0 1.6
 1.0 1.1
 1.5 1.6
 1.1 0.9

Data Adjust
 x y
 ------------
 0.69 0.49
 -1.31 -1.21
 0.39 0.99
 0.09 0.29
 1.29 1.09
 0.49 0.79
 0.19 -0.31
 -0.81 -0.81
 -0.31 -0.31
 -0.71 -1.01

Singular values:
 +3.3994483978 +0.6646432054

Eigenvalues:
 +11.5562494096 +0.4417505904

Eigenvalues (normalized):
 +1.2840277122 +0.0490833989

Eigenvectors:
 +0.6778733985 -0.7351786555
 +0.7351786555 +0.6778733985

Transformed Data
 x y
 ----------------------------
 +0.8279701862 -0.1751153070
 -1.7775803253 +0.1428572265
 +0.9921974944 +0.3843749889
 +0.2742104160 +0.1304172066
 +1.6758014186 -0.2094984613
 +0.9129491032 +0.1752824436
 -0.0991094375 -0.3498246981
 -1.1445721638 +0.0464172582
 -0.4380461368 +0.0177646297
 -1.2238205551 -0.1626752871

// Show on screen.
Console.WriteLine("Data");
Console.WriteLine(" x y");
Console.WriteLine(" ------------");
Console.WriteLine(data.ToString(" 0.0 "));
Console.ReadKey();

Console.WriteLine("Data Adjust");
Console.WriteLine(" x y");
Console.WriteLine(" ------------");
Console.WriteLine(dataAdjust.ToString(" 0.00; -0.00;"));
Console.ReadKey();

ScatterplotBox.Show("Original PCA data", data);

Console.WriteLine("Singular values: ");
Console.WriteLine(singularValues.ToString(" +0.0000000000; -0.0000000000;"));
Console.WriteLine("Eigenvalues: ");
Console.WriteLine(eigenvalues.ToString(" +0.0000000000; -0.0000000000;"));

// Normalize eigenvalues to replicate the covariance
eigenvalues = eigenvalues.Divide(data.GetLength(0) - 1);
Console.WriteLine("Eigenvalues (normalized): ");
Console.WriteLine(eigenvalues.ToString(" +0.0000000000; -0.0000000000;"));

Console.WriteLine("Eigenvectors:");
Console.WriteLine(eigenvectors.ToString(" +0.0000000000; -0.0000000000;"));
Console.ReadKey();

Console.WriteLine("Transformed Data");
Console.WriteLine(" x y");
Console.WriteLine(" ----------------------------");
Console.WriteLine(finalData.ToString(" +0.0000000000; -0.0000000000;"));

All of the steps above come packaged in the PrincipalComponentAnalysis class:

var pca = new PrincipalComponentAnalysis(data);
pca.Overwrite = true;
pca.Method = AnalysisMethod.Standardize;
pca.Compute();

The computed components can then be bound directly to a grid control:

dataGridView1.DataSource = pca.Components;

// Step 1. Get some data
double[,] data =
{
    { 2.5, 2.4 },
    { 0.5, 0.7 },
    { 2.2, 2.9 },
    { 1.9, 2.2 },
    { 3.1, 3.0 },
    { 2.3, 2.7 },
    { 2.0, 1.6 },
    { 1.0, 1.1 },
    { 1.5, 1.6 },
    { 1.1, 0.9 }
};

// Step 2. Create the Principal Component Analysis
var pca = new PrincipalComponentAnalysis(data);

// Step 3. Compute the analysis
pca.Compute();

// Step 4. Transform your data
double[,] finalData = pca.Transform(data);

// Show on screen
Console.WriteLine("Data");
Console.WriteLine(" x y");
Console.WriteLine(" ------------");
Console.WriteLine(data.ToString(" 0.0 "));
Console.ReadKey();

ScatterplotBox.Show("Original PCA data", data);

Console.WriteLine("Eigenvalues: ");
Console.WriteLine(pca.Eigenvalues.ToString(" +0.0000000000; -0.0000000000;"));
Console.WriteLine("Eigenvectors:");
Console.WriteLine(pca.ComponentMatrix.ToString(" +0.0000000000; -0.0000000000;"));
Console.ReadKey();

Console.WriteLine("Transformed Data");
Console.WriteLine(" x y");
Console.WriteLine(" ----------------------------");
Console.WriteLine(finalData.ToString(" 0.0000000000; -0.0000000000;"));

Output:

Data
 x y
 ------------
 2.5 2.4
 0.5 0.7
 2.2 2.9
 1.9 2.2
 3.1 3.0
 2.3 2.7
 2.0 1.6
 1.0 1.1
 1.5 1.6
 1.1 0.9

Eigenvalues:
 +1.2840277122 +0.0490833989

Eigenvectors:
 +0.6778733985 -0.7351786555
 +0.7351786555 +0.6778733985

Transformed Data
 x y
 ----------------------------
 0.8279701862 -0.1751153070
 -1.7775803253 0.1428572265
 0.9921974944 0.3843749889
 0.2742104160 0.1304172066
 1.6758014186 -0.2094984613
 0.9129491032 0.1752824436
 -0.0991094375 -0.3498246981
 -1.1445721638 0.0464172582
 -0.4380461368 0.0177646297
 -1.2238205551 -0.1626752871