
This work studies the recovery of sparse signals, contrasting the $\ell_0$, $\ell_1$ and $\ell_2$ norms as measures of sparsity and fidelity, and the role of the regularization parameter $\lambda$ and the step size $\mu$ in the resulting algorithms.

For a vector $x \in \mathbb{R}^N$ the $\ell_2$ norm is given by $\|x\|_2^2 = \sum_{i=1}^{N} x_i^2$, and more generally the $\ell_p$ norm by
$\|x\|_p = \left( \sum_{i=1}^{N} |x_i|^p \right)^{1/p}$.
For $p = 2$ this recovers the $\ell_2$ norm, while for $p = 1$ the $\ell_1$ norm is $\|x\|_1 = \sum_{i=1}^{N} |x_i|$.
The $\ell_0$ "norm" counts the non-zero entries, $\|x\|_0 = |\{\, i : x_i \neq 0 \,\}|$; a vector $x \in \mathbb{R}^N$ is called $s$-sparse when $\|x\|_0 \le s$.
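
As a quick numerical illustration (not part of the original text), these norms can be computed with numpy for a small example vector:

```python
import numpy as np

x = np.array([0.0, 3.0, 0.0, -4.0, 0.0])    # example vector in R^5

l2 = np.sqrt(np.sum(x**2))                   # ||x||_2 = 5.0
l1 = np.sum(np.abs(x))                       # ||x||_1 = 7.0
p = 0.5
lp = np.sum(np.abs(x)**p) ** (1.0 / p)       # ||x||_p for p = 0.5
l0 = np.count_nonzero(x)                     # ||x||_0 = 2, so x is 2-sparse
```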

The signal $x \in \mathbb{R}^N$ is assumed to have a sparse representation in a basis $\Phi \in \mathbb{R}^{N \times N}$, i.e. $x = \Phi\theta$ with $\theta$ $s$-sparse. The measurements are taken through a matrix $\Psi \in \mathbb{R}^{M \times N}$ with $M < N$, so that
$y = \Psi x = \Psi\Phi\theta = A\theta$, with $A = \Psi\Phi$ and $y \in \mathbb{R}^M$.
In what follows the basis is absorbed into $A$ and the measurement model is written $y = Ax + e$ for an $s$-sparse $x$ and noise $e$.

A matrix $A$ satisfies the restricted isometry property (RIP) of order $s$ with constant $\delta_s$ if
$(1-\delta_s)\|x\|_2^2 \le \|Ax\|_2^2 \le (1+\delta_s)\|x\|_2^2$
for every $s$-sparse $x$. Sparse recovery is naturally posed as the $\ell_0$ problem
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|x\|_0 \quad \text{s.t.} \quad y = Ax$.
If two $s$-sparse vectors $x_1, x_2$ produce the same measurements, then $A(x_1 - x_2) = Ax_1 - Ax_2 = 0$.

Since $\|x_1 - x_2\|_0 \le \|x_1\|_0 + \|x_2\|_0 = 2s$, the difference is an (at most) $2s$-sparse vector in the null space of $A$. The spark of $A$ is the smallest number of linearly dependent columns,
$\mathrm{spark}(A) = \min \{ \|x\|_0 : Ax = 0,\ x \neq 0 \}$.
For example,
$A_{ex} = \begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}$
has $\mathrm{rank}(A_{ex}) = 3$ but $\mathrm{spark}(A_{ex}) = 2$, since its first and last columns coincide. If $\mathrm{spark}(A) > 2s$, then every $s$-sparse $x$ with $y = Ax$ is the unique $s$-sparse solution of $y = Ax$.

For any choice of $N$ and $M$ and any $A \in \mathbb{R}^{M \times N}$, any $M+1$ columns are linearly dependent, so $\mathrm{spark}(A) \le M + 1$.

Assume the measurement noise is white Gaussian, with density
$f_E(e) = \frac{1}{(\sqrt{2\pi}\,\sigma_e)^{M}} \exp\!\left( -\frac{\|e\|_2^2}{2\sigma_e^2} \right)$.

With $y = Ax + e$ the likelihood is
$f_Y(y \mid x) = \frac{1}{(\sqrt{2\pi}\,\sigma_e)^{M}} \exp\!\left( -\frac{\|y - Ax\|_2^2}{2\sigma_e^2} \right)$,
so maximizing $f_Y(y \mid x)$ is the least-squares problem
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|y - Ax\|_2^2$.
Setting the gradient to zero gives the normal equations $A^T A \hat{x} = A^T y$, i.e. $\hat{x} = (A^T A)^{-1} A^T y$ when $A^T A$ is invertible. For $M < N$, however, $A^T A$ is singular and $y = Ax$ has infinitely many solutions.
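
A minimal least-squares sketch under these assumptions (hypothetical sizes and data, numpy only):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 50, 10                                    # overdetermined case, M > N
A = rng.standard_normal((M, N))
x_true = rng.standard_normal(N)
y = A @ x_true + 0.1 * rng.standard_normal(M)    # noisy measurements

# least-squares estimate, equivalent to (A^T A)^{-1} A^T y when A has full column rank
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
```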

[Figure: fitting a quadratic model $y = ax^2 + bx + c$ with $a = 2$, $b = 3$, $c = 0$; noisy measurements, the real underlying model, and fitted 1st-, 2nd- and 15th-order models.]

In a Bayesian setting the posterior is
$P(x \mid y) = \frac{P(x, y)}{P(y)} = \frac{P(y \mid x)\,P(x)}{P(y)}$,
and the MAP estimate maximizes the numerator,
$\hat{x} = \arg\max_{x \in \mathbb{R}^N} P(y \mid x)\,P(x)$,
since $P(y)$ does not depend on $x$. In the noiseless case $P(y \mid x) = \delta(y - Ax)$.

With a zero-mean Gaussian prior $P(x) \propto \exp\!\left(-\frac{\|x\|_2^2}{2\sigma_x^2}\right)$, the noiseless MAP estimate
$\hat{x} = \arg\max_{x \in \mathbb{R}^N} \delta(y - Ax)\exp\!\left(-\frac{\|x\|_2^2}{2\sigma_x^2}\right)$
is the minimum $\ell_2$-norm solution
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|x\|_2 \quad \text{s.t.} \quad y = Ax$,
which for $M < N$ has the closed form $\hat{x} = A^T (A A^T)^{-1} y$. In the noisy case the MAP estimate becomes
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \frac{\|y - Ax\|_2^2}{2\sigma_e^2} + \frac{\|x\|_2^2}{2\sigma_x^2}
= \arg\min_{x \in \mathbb{R}^N} \|y - Ax\|_2^2 + \lambda \|x\|_2^2, \qquad \lambda = \frac{\sigma_e^2}{\sigma_x^2}$.
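
A corresponding sketch of the minimum $\ell_2$-norm solution for an underdetermined system (illustrative sizes, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 15, 60                                # underdetermined: M < N
A = rng.standard_normal((M, N))
y = rng.standard_normal(M)

# minimum l2-norm solution x_hat = A^T (A A^T)^{-1} y
x_min_norm = A.T @ np.linalg.solve(A @ A.T, y)
assert np.allclose(A @ x_min_norm, y)        # the constraints are satisfied exactly
```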

[Figure: impulse response estimation; least-squares norm fitting of a sparse system with 20, 40 and 60 measurements.]

The regularized (ridge) solution has the closed form
$\hat{x} = (A^T A + \lambda I_N)^{-1} A^T y$,
where $I_N$ is the $N \times N$ identity matrix; the parameter $\lambda$ trades data fidelity against the $\ell_2$ norm of the solution.
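
This closed form translates directly into code (illustrative helper, assuming numpy arrays A and y):

```python
import numpy as np

def ridge_solution(A, y, lam):
    """Closed-form ridge estimate (A^T A + lam*I)^{-1} A^T y."""
    N = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ y)
```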

The $\ell_2$ norm favors spreading energy over many small entries rather than concentrating it: for $a, b > 0$, comparing $x_1 = [a+b, 0]^T$ with $x_2 = [a, b]^T$ gives $\|x_1\|_2^2 - \|x_2\|_2^2 = 2ab > 0$, so the less sparse vector $x_2$ has the smaller $\ell_2$ norm. Sparse or compressible vectors are instead naturally modeled by heavy-tailed distributions, for which a few entries dominate: for independent $X_1, \ldots, X_n$ the tail of the maximum behaves like the sum of the individual tails, $P(\max_{i=1\ldots n} X_i > x) \le \sum_{i=1}^{n} P(X_i > x)$, so occasional large values are much more likely than under a Gaussian model.

Typical heavy-tailed models include the log-normal distribution (if $Y = \ln(X)$ is Gaussian, then $X$ has density $f(x) = \frac{1}{x\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right)$), the Student-t distribution
$f_T(t) = \frac{1}{\sqrt{\nu}\,B\!\left(\frac{1}{2}, \frac{\nu}{2}\right)} \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}}$,
where $B(x, y)$ is the Beta function, and the Laplace distribution, which plays for the $\ell_1$ norm the role the Gaussian plays for the $\ell_2$ norm: for $x \in \mathbb{R}$,
$f(x; \mu, b) = \frac{1}{2b} \exp\!\left(-\frac{|x - \mu|}{b}\right)$,
and for an i.i.d. Laplace vector in $\mathbb{R}^N$ with location $\mu$ and scale $b$,
$f(x; \mu, b) = \left(\frac{1}{2b}\right)^{N} \exp\!\left(-\frac{\|x - \mu\|_1}{b}\right)$.

[Figure: comparison of Student-t densities for $n = 2, 5, 10$ with the normal density.]
Under a Laplace prior, the noiseless MAP estimate becomes the basis pursuit problem
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|x\|_1 \quad \text{s.t.} \quad y = Ax$,
and in the noisy case the $\ell_1$-regularized least-squares (LASSO) problem
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|y - Ax\|_2^2 + \lambda \|x\|_1, \qquad \lambda = \frac{2\sigma_e^2}{b}$.
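
As an illustration only (not the method of the text), the regularized form can be handed to an off-the-shelf solver such as scikit-learn's Lasso; scikit-learn scales the data term by $1/(2M)$, so its alpha corresponds to $\lambda/(2M)$ here:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
M, N, s = 30, 100, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true + 0.01 * rng.standard_normal(M)

# scikit-learn minimizes (1/(2M))*||y - Ax||_2^2 + alpha*||x||_1
lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=10000)
x_hat = lasso.fit(A, y).coef_
```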

[Figure: comparison of the Laplace density with $b = \frac{\sqrt{2}}{2}$ (unit variance) and the standard normal density.]

The difference between $\ell_1$ and $\ell_2$ minimization has a simple geometric picture in $\mathbb{R}^2$. A single measurement $y = A_{11}x_1 + A_{12}x_2$ constrains the solution to a line. The level sets $\|x\|_2 = c$ are circles of radius $c$, while the level sets $\|x\|_1 = c$ are diamonds with vertices on the axes at distance $c$. Growing $c$ from zero, the circle generally first touches the constraint line at a point with both coordinates non-zero, whereas the diamond first touches it at a vertex, i.e. at a sparse solution. The same holds for $\ell_p$ balls with $p \le 1$, but only $p \ge 1$ yields a convex problem.

[Figure: $\ell_2$- versus $\ell_1$-norm minimization in the plane under the constraint $y = x_1 + 2x_2 = 1$; the $\ell_2$ solution is $[1/5, 2/5]^T$, while the $\ell_1$ solution is the sparse point $[0, 1/2]^T$ with $\|x\|_1 = 1/2$.]

[Figure: comparison of $\ell_p$ balls for $p = 0.5$, $p = 1$ and $p = 2$ under the same constraint; the $\ell_{0.5}$ ball, like the $\ell_1$ ball, first meets the line at the sparse point $[0, 1/2]^T$, but $\ell_{0.5}$ minimization is non-convex.]

The $\ell_1$ norm can therefore act as a convex surrogate for the $\ell_0$ problem, unlike the $\ell_2$ norm. To quantify when this substitution is exact, define the mutual coherence of $A \in \mathbb{R}^{M \times N}$ with columns $a_i$ as
$\mu(A) = \max_{1 \le i < j \le N} \frac{|a_i^T a_j|}{\|a_i\|_2 \|a_j\|_2}$.
The coherence bounds the spark from below:
$\mathrm{spark}(A) \ge 1 + \frac{1}{\mu(A)}$.
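
A small helper for the mutual coherence (illustrative, assuming a numpy array A with non-zero columns):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute normalized inner product between distinct columns of A."""
    G = A / np.linalg.norm(A, axis=0)      # normalize the columns
    C = np.abs(G.T @ G)                    # Gram matrix of the normalized columns
    np.fill_diagonal(C, 0.0)               # ignore the diagonal (i == j)
    return C.max()
```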

If $x_0 \in \mathbb{R}^N$ satisfies $\|x_0\|_0 < \frac{1}{2}\left(1 + \frac{1}{\mu(A)}\right)$ and $y = Ax_0$, then $x_0$ is the unique sparsest solution of $y = Ax$; under the same condition it is also the unique solution of the $\ell_1$ (basis pursuit) problem, so $\ell_0$ and $\ell_1$ minimization coincide. A related condition is phrased through the RIP of order $2s$: for any two $s$-sparse vectors $x_1, x_2$,
$(1 - \delta_{2s})\|x_1 - x_2\|_2^2 \le \|A(x_1 - x_2)\|_2^2 \le (1 + \delta_{2s})\|x_1 - x_2\|_2^2$,
since $x_1 - x_2$ is at most $2s$-sparse.

If $\delta_{2s} < \sqrt{2} - 1$, then $\ell_1$ minimization recovers every $s$-sparse vector exactly, whereas the $\ell_0$ problem only requires $\delta_{2s} < 1$. Matrices with i.i.d. entries drawn from $\mathcal{N}\!\left(0, \frac{1}{M}\right)$ satisfy the RIP of order $s$ with constant $\delta_s$ with high probability provided
$M \ge C\, s \log(N/s)$,
where the constant $C$ depends on $\delta_s$.

Intuitively, such matrices preserve the $\ell_2$ norm: if the entries of $A$ are i.i.d. $\mathcal{N}\!\left(0, \frac{1}{M}\right)$, then for a fixed $x$ each measurement $(Ax)_i = \sum_{j=1}^{N} A_{ij}x_j$ is Gaussian with variance $\|x\|_2^2 / M$, so $\mathbb{E}\|Ax\|_2^2 = \|x\|_2^2$, and concentration of measure makes the near-isometry hold simultaneously for all sparse vectors with high probability.
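
A sketch of this recipe (assumed sizes; scipy's linprog is used as a generic LP solver, which is one of several ways to solve basis pursuit):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
M, N, s = 20, 100, 3
A = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))     # i.i.d. N(0, 1/M) entries
x_true = np.zeros(N)
x_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true                                          # noiseless measurements

# basis pursuit min ||x||_1 s.t. Ax = y, written as an LP with x = u - v, u, v >= 0
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=[(0, None)] * (2 * N))
x_hat = res.x[:N] - res.x[N:]
```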

[Figure: perfect reconstruction of a sparse signal with basis pursuit; the measurement samples are generated by a random Gaussian matrix.]

[Figure: SNR of the reconstruction versus sparsity for several numbers of measurements $M$ (20 to 100), for noiseless measurements and measurements at 20 dB SNR.]

When the measurements are noisy, the equality constraint is relaxed to an $\ell_2$ ball:
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|x\|_1 \quad \text{s.t.} \quad \|y - Ax\|_2^2 \le \epsilon^2$.

For suitable choices of $\lambda$ and $\epsilon$ the same solution is obtained from the regularized form or from the constrained form
$\hat{x} = \arg\min_{x \in \mathbb{R}^N} \|y - Ax\|_2^2 \quad \text{s.t.} \quad \|x\|_1 \le l$,
where $l$ bounds the $\ell_1$ norm. If $A$ satisfies the RIP with $\delta_{2s} < \sqrt{2} - 1$ and the noise obeys $\|e\|_2 \le \epsilon$, the $\ell_1$ solution satisfies
$\|\hat{x} - x\|_2 \le C_0 \frac{\|x - x_s\|_1}{\sqrt{s}} + C_1 \epsilon$,
where $x_s$ is the best $s$-sparse approximation of $x$ and $C_0, C_1$ are constants.

[Figure: SNR of the reconstruction versus sparsity for several numbers of measurements $M$, with noisy measurements at 20 dB and 10 dB SNR.]

The $\ell_1$ relaxation replaces the combinatorial $\ell_0$ problem with a convex one, but solving it can still be costly; greedy algorithms attack the $\ell_0$ problem directly by building the support of the solution of $y = Ax$ one column of $A$ at a time.

Matching pursuit works as follows. With normalized columns $a_k$, initial residual $r_0 = y$ and $x_0 = 0$, iteration $i$ selects the column most correlated with the current residual,
$j_i = \arg\max_{1 \le k \le N} |a_k^T r_{i-1}|$,
updates the estimate $x_i = x_{i-1} + (a_{j_i}^T r_{i-1})\, \delta_N(j_i)$, where $\delta_N(j_i) \in \mathbb{R}^N$ is the $j_i$-th unit vector, and updates the residual $r_i = r_{i-1} - (a_{j_i}^T r_{i-1})\, a_{j_i}$.

Orthogonal matching pursuit (OMP) replaces the scalar update with a full least-squares fit on the selected support. Starting from $r_0 = y$ and $S_0 = \emptyset$, each iteration selects $j_i$ as above, sets $S_i = S_{i-1} \cup \{j_i\}$, solves
$\hat{x}_i = \arg\min_{z} \|y - A_{S_i} z\|_2$
over the selected columns $A_{S_i}$, and forms the new residual $r_i = y - A_{S_i}\hat{x}_i$, which is orthogonal to all selected columns, $A_{S_i}^T r_i = 0$.
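
A compact OMP sketch following the steps above (illustrative; the active-set least squares is solved with numpy's lstsq):

```python
import numpy as np

def omp(A, y, s):
    """Orthogonal matching pursuit: greedily select s columns of A to explain y."""
    N = A.shape[1]
    residual = y.copy()
    support = []
    x_hat = np.zeros(N)
    for _ in range(s):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # least-squares fit restricted to the selected columns
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat
```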

Like $\ell_1$ minimization, OMP recovers the sparsest solution of $y = Ax$ exactly when $\|x\|_0 < \frac{1}{2}\left(1 + \frac{1}{\mu(A)}\right)$.

[Figure: SNR of the OMP reconstruction versus sparsity for several numbers of measurements $M$, for noiseless measurements and measurements at 20 dB SNR.]

A related family of greedy algorithms selects several columns per iteration. At iteration $i$, the $t$ columns with the largest correlations $|A^T r_{i-1}|$ are added to the previous support $S_{i-1}$, a least-squares estimate $\hat{x}_i$ is computed over the merged support $A_{S_i}$, and the result is pruned back to $s$ entries by hard thresholding, $x_i = H_s(\hat{x}_i)$, where $H_s$ keeps the $s$ largest-magnitude entries and zeroes the rest; the residual is then $r_i = y - Ax_i$.

For example, $H_2\big([1, 8, 9, 0]^T\big) = [0, 8, 9, 0]^T$. Common choices for the number of columns added per iteration are $t = s$ and $t = 2s$.
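
The hard-thresholding operator itself is a small helper (illustrative numpy sketch):

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]       # indices of the s largest |x_i|
    out[idx] = x[idx]
    return out

# example from the text: H_2([1, 8, 9, 0]) = [0, 8, 9, 0]
print(hard_threshold(np.array([1.0, 8.0, 9.0, 0.0]), 2))
```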

[Figure: SNR of the thresholding-based greedy reconstruction versus sparsity for several numbers of measurements $M$, for noiseless measurements and measurements at 20 dB SNR.]

Gradient iterations offer another route. A root of $g(x) = 0$ can be sought through a fixed-point iteration $x_{i+1} = f(x_i)$; applied to the gradient of the least-squares cost $\|y - Ax\|_2^2$, this gives the Landweber step
$x_{i+1} = x_i + \mu A^T r_i, \qquad r_i = y - A x_i,$
with step size $\mu$.

To enforce a constraint, each gradient step is followed by a projection $P_T$ onto the feasible set,
$x_{i+1} = P_T\big(x_i + \mu A^T r_i\big)$.
For the $\ell_0$ constraint the projection is exactly the hard-thresholding operator $H_s$, giving the iterative hard thresholding (IHT) algorithm
$x_{i+1} = H_s\big(x_i + \mu A^T r_i\big)$,
which converges provided $\mu \|A\|_2^2 < 1$, where $\|A\|_2$ is the largest singular value of $A$ (the square root of the largest eigenvalue of $A^T A$).
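
An IHT sketch under these assumptions (the step size $0.9/\|A\|_2^2$ and iteration count are illustrative choices):

```python
import numpy as np

def iht(A, y, s, n_iter=200):
    """Iterative hard thresholding: x <- H_s(x + mu * A^T (y - A x))."""
    mu = 0.9 / np.linalg.norm(A, 2) ** 2    # step size satisfying mu * ||A||_2^2 < 1
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x + mu * A.T @ (y - A @ x)      # gradient step on ||y - Ax||_2^2
        idx = np.argsort(np.abs(z))[-s:]    # keep the s largest-magnitude entries
        x = np.zeros_like(z)
        x[idx] = z[idx]
    return x
```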

[Figure: SNR of the IHT reconstruction versus sparsity for several numbers of measurements $M$, for noiseless measurements and measurements at 20 dB SNR.]

For the $\ell_1$-regularized cost the corresponding proximal operator is the soft-thresholding function $S_\alpha$, which shrinks every entry towards zero by $\alpha$, in contrast to hard thresholding, which leaves the surviving entries unchanged.
[Figure: soft versus hard thresholding as functions of the input.]
The resulting iteration is
$x_{i+1} = S_{\lambda\mu}\big(x_i + \mu A^T r_i\big)$.

With the step size $\mu$ bounded by the largest eigenvalue of $A^T A$, the iterative soft thresholding algorithm (ISTA) reads
$x_i = S_{\lambda\mu}\big(x_{i-1} + \mu A^T (y - A x_{i-1})\big), \qquad x_0 = 0$.
Its accelerated variant (FISTA) adds a momentum step,
$z_{i+1} = x_i + \frac{t_i - 1}{t_{i+1}} (x_i - x_{i-1}), \qquad t_{i+1} = \frac{1 + \sqrt{1 + 4t_i^2}}{2}, \quad t_1 = 1,$
and applies the thresholded gradient step to $z_{i+1}$ instead of $x_i$.
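
An ISTA sketch of the iteration above (illustrative; the FISTA momentum step is omitted for brevity):

```python
import numpy as np

def soft_threshold(x, alpha):
    """Soft thresholding S_alpha: shrink every entry towards zero by alpha."""
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA iteration x <- S_{lam*mu}(x + mu * A^T (y - A x))."""
    mu = 0.9 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + mu * A.T @ (y - A @ x), lam * mu)
    return x
```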

In the adaptive filtering setting, an unknown system with impulse response $h(n) = [h_0(n), \ldots, h_{p-1}(n)]^T$ of order $p$ is driven by an input $x(n)$; its output $y(n)$ is observed in noise as $d(n) = y(n) + v(n)$, where $v(n)$ is the measurement noise. Collecting the most recent input samples in the regressor $\mathbf{x}(n) = [x(n), \ldots, x(n-p+1)]^T$, the output is $y(n) = h^T(n)\,\mathbf{x}(n)$, and the goal is to track an estimate $\hat{h}(n)$ from $\mathbf{x}(n)$ and $d(n)$.

Given the previous estimate $\hat{h}(n-1)$, the prediction is $\hat{y}(n) = \hat{h}^T(n-1)\,\mathbf{x}(n)$ and the a priori error is
$e(n) = d(n) - \hat{y}(n) = d(n) - \hat{h}^T(n-1)\,\mathbf{x}(n)$.
The RLS algorithm minimizes the exponentially weighted cost
$J(n) = \sum_{i=0}^{n} \lambda^{n-i} e^2(i)$
with forgetting factor $\lambda \in (0, 1]$; small $\lambda$ forgets old data quickly, while $\lambda = 1$ weighs all samples equally. The minimizer satisfies the weighted normal equations $R_{xx}(n)\,\hat{h}(n) = r_{dx}(n)$, where
$R_{xx}(n) = \sum_{i=0}^{n} \lambda^{n-i}\, \mathbf{x}(i)\,\mathbf{x}^T(i)$

and the weighted cross-correlation between input and desired signal is
$r_{dx}(n) = \sum_{i=0}^{n} \lambda^{n-i}\, \mathbf{x}(i)\, d(i)$.
Both quantities obey rank-one recursions,
$R_{xx}(n) = \lambda R_{xx}(n-1) + \mathbf{x}(n)\,\mathbf{x}^T(n), \qquad r_{dx}(n) = \lambda\, r_{dx}(n-1) + \mathbf{x}(n)\, d(n)$,
but solving the normal equations directly costs $O(p^3)$ per step. Applying the matrix inversion lemma to $P(n) = R_{xx}^{-1}(n)$ yields the RLS recursions
$e(n) = d(n) - \hat{h}^T(n-1)\,\mathbf{x}(n)$,
$k(n) = \frac{P(n-1)\,\mathbf{x}(n)}{\lambda + \mathbf{x}^T(n)\,P(n-1)\,\mathbf{x}(n)}$,
$P(n) = \lambda^{-1}\big(P(n-1) - k(n)\,\mathbf{x}^T(n)\,P(n-1)\big)$,
$\hat{h}(n) = \hat{h}(n-1) + e(n)\,k(n)$,
with complexity $O(p^2)$ per step. The algorithm is initialized with $\hat{h}(0) = 0$ and $P(0) = \delta^{-1} I_p$ for a small $\delta > 0$, where $I_p$ is the $p \times p$ identity.
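
A direct transcription of these recursions (illustrative helper; x and d are assumed to be 1-D numpy arrays of input and desired samples):

```python
import numpy as np

def rls(x, d, p, lam=0.99, delta=1e-2):
    """Exponentially weighted RLS estimate of a length-p impulse response."""
    h = np.zeros(p)
    P = np.eye(p) / delta                          # P(0) = delta^{-1} I_p
    for n in range(p - 1, len(x)):
        xn = x[n - p + 1:n + 1][::-1]              # regressor [x(n), ..., x(n-p+1)]
        e = d[n] - h @ xn                          # a priori error
        k = P @ xn / (lam + xn @ P @ xn)           # gain vector
        P = (P - np.outer(k, xn @ P)) / lam        # inverse-correlation update
        h = h + e * k                              # coefficient update
    return h
```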

The forgetting factor $\lambda$ controls the trade-off between tracking speed and noise sensitivity: a small $\lambda$ tracks changes quickly but is more affected by the noise $v(n)$, while $\lambda$ close to 1 averages over more data. An alternative is to minimize the mean-square error $J(n) = E\{e^2(n)\}$ by gradient descent,
$\hat{h}(n) = \hat{h}(n-1) - \frac{\mu}{2}\, \nabla_{\hat{h}} J(n)$,
with step size $\mu$.

[Figure: error-to-signal ratio of RLS impulse response estimation under noise for $\lambda = 0.10, 0.50, 1.00$; the true response is $[1, 2, 3, 4]^T$.]

[Figure: error-to-signal ratio when the system changes during adaptation from $[1, 2, 3, 4]^T$ to $[3, 4, 7, 8]^T$, for $\lambda = 0.10, 0.50, 1.00$; the case $\lambda = 1$ is included.]

For the mean-square cost the gradient update becomes
$\hat{h}(n) = \hat{h}(n-1) + \mu\, E\{\mathbf{x}(n)\,e(n)\}$.
The expectation $E\{\mathbf{x}(n)\,e(n)\}$ is estimated from the last $\nu$ samples,
$\hat{E}\{\mathbf{x}(n)\,e(n)\} = \frac{1}{\nu}\sum_{i=0}^{\nu-1} \mathbf{x}(n-i)\,e(n-i)$,
and for $\nu = 1$ this yields the LMS algorithm
$\hat{h}(n) = \hat{h}(n-1) + \mu\, \mathbf{x}(n)\, e(n)$,
with complexity only $O(p)$ per step. The step size must satisfy $0 < \mu < \frac{2}{\sum_i \lambda_i}$, where $\lambda_i$ are the eigenvalues of $R_{xx}$. Normalizing the step by the instantaneous input energy gives the NLMS update
$\hat{h}(n) = \hat{h}(n-1) + \mu\, \frac{\mathbf{x}(n)\, e(n)}{\mathbf{x}^T(n)\,\mathbf{x}(n)}$.
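
An NLMS sketch under the same assumptions (the small eps guarding against division by zero is an implementation detail not in the text):

```python
import numpy as np

def nlms(x, d, p, mu=1.0, eps=1e-8):
    """Normalized LMS estimate of a length-p impulse response."""
    h = np.zeros(p)
    for n in range(p - 1, len(x)):
        xn = x[n - p + 1:n + 1][::-1]             # regressor [x(n), ..., x(n-p+1)]
        e = d[n] - h @ xn                          # a priori error
        h = h + mu * e * xn / (xn @ xn + eps)      # normalized update
    return h
```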

[Figure: error-to-signal ratio of LMS estimation of the response $[1, 2, 3, 4]^T$ for $\mu = 0.10, 1.00, 1.50, 1.99, 2.00$.]

In a network of nodes, each node $i$ has its own cost $J_i^n(\hat{h})$ and cooperates with its neighborhood $N_i$. In diffusion adaptation each node first performs a local adaptation step using gradient information from its neighbors,
$\psi_i(n) = \hat{h}_i(n-1) - \mu_i \sum_{l \in N_i} c_{l,i}\, \nabla J_l^n\big(\hat{h}_l(n-1)\big)$,
where $\mu_i$ is the local step size and the weights $c_{l,i}$, with $\sum_{l \in N_i} c_{l,i} = 1$, determine how much node $i$ trusts the gradient of neighbor $l$. It then combines the intermediate estimates of its neighbors,
$\hat{h}_i(n) = \sum_{l \in N_i} a_{l,i}\, \psi_l(n)$,

where the combination weights satisfy $\sum_{l \in N_i} a_{l,i} = 1$. Setting $c_{i,i} = 1$ disables gradient sharing, while $a_{i,i} = 1$ disables the combination step and reduces the scheme to non-cooperative adaptation.

Sparsity can also be imposed through hierarchical Bayesian models: the coefficients generating $y(n)$ are given a prior governed by hyperparameters $\theta$, and inference is based on the marginal obtained by integrating them out,
$p(y) = \int p(y \mid \theta)\, p(\theta)\, d\theta$,
with prior $p(\theta)$.

α α α A s (n) R s (n) R(n) A s (n 1) R s (n) = R(n) + A s (n 1) A s α λ

When the unknown response is known to be $s$-sparse, the NLMS update can be combined with hard thresholding: an intermediate estimate
$\psi(n) = \hat{h}(n) + \frac{e(n)\,\mathbf{x}(n)}{\mathbf{x}^T(n)\,\mathbf{x}(n)}$
is computed first and then pruned to its $s$ largest-magnitude coefficients,
$\hat{h}(n+1) = H_s\big(\psi(n)\big)$.
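
A sketch of the hard-thresholded NLMS described above (illustrative; the step size mu and the regularization eps are implementation choices):

```python
import numpy as np

def sparse_nlms(x, d, p, s, mu=1.0, eps=1e-8):
    """NLMS combined with hard thresholding to keep only s coefficients."""
    h = np.zeros(p)
    for n in range(p - 1, len(x)):
        xn = x[n - p + 1:n + 1][::-1]
        e = d[n] - h @ xn
        psi = h + mu * e * xn / (xn @ xn + eps)   # NLMS intermediate estimate
        keep = np.argsort(np.abs(psi))[-s:]       # indices of the s largest entries
        h = np.zeros_like(psi)
        h[keep] = psi[keep]                       # hard thresholding H_s
    return h
```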

[Figure: NMSE (dB) versus iterations for RLS, GARLS, ASVB-S, ASVB-L and ASVB-mpL.]

[Figure: SNR (dB) versus iterations for the sparsity-aware NLMS and the standard NLMS.]

The hard-thresholding step can be justified as follows. Let $h$ be the true $s$-sparse response and $q = \min\{|h_i| : h_i \neq 0\}$ its smallest non-zero magnitude. If there exists $n_0 \in \mathbb{N}$ such that $\|\hat{h}(n) - h\|_2^2 < \frac{q^2}{2}$ for all $n \ge n_0$, then $\mathrm{supp}\big(H_s(\hat{h}(n))\big) = \mathrm{supp}(h)$ for all $n \ge n_0$. To see this, suppose that for some $n \ge n_0$ the support of $H_s(\hat{h}(n))$ differs from that of $h$. Then some index $z$ with $|h_z| \ge q$ is not selected, while some index $r$ with $h_r = 0$ is selected, so $|\hat{h}_r(n)| \ge |\hat{h}_z(n)|$ and we may write $\hat{h}_r(n)^2 = \hat{h}_z(n)^2 + \epsilon^2$. The squared error $g(n) = \|\hat{h}(n) - h\|_2^2$ then satisfies
$g(n) \ge (\hat{h}_r(n) - h_r)^2 + (\hat{h}_z(n) - h_z)^2 = 2\hat{h}_z(n)^2 - 2\hat{h}_z(n)\,h_z + h_z^2 + \epsilon^2$.
Viewed as a function of $x = \hat{h}_z(n)$, $f(x) = 2x^2 - 2h_z x + (h_z^2 + \epsilon^2)$ is minimized at $x = h_z/2$ (from $f'(x) = 0$), where it equals $\epsilon^2 + \frac{h_z^2}{2}$. Hence
$g(n) \ge \epsilon^2 + \frac{h_z^2}{2} \ge \frac{q^2}{2}$,
contradicting the assumption $g(n) < \frac{q^2}{2}$.

The same argument applies when several index pairs $r, z$ are exchanged between the support of $H_s(\hat{h}(n))$ and that of $h$, since the error $g(n) = \|\hat{h}(n) - h\|_2^2$ can only grow. A relaxed variant keeps $k > s$ coefficients per iteration, using $H_k$ instead of $H_s$. With $\tau = k - s$ and $q = \min\{|h_i| : h_i \neq 0\}$ as before: if there exists $n_0 \in \mathbb{N}$ such that $\|\hat{h}(n) - h\|_2^2 < q^2\left(1 - \frac{1}{\tau+2}\right)$ for all $n \ge n_0$, then the support of $h$ is contained in the support of $H_k(\hat{h}(n))$ for all $n \ge n_0$.

[Figure: SNR (dB) versus iterations for the sparsity-aware NLMS and the standard NLMS, second experiment.]

Suppose that for some $n \ge n_0$ an index $z$ with $|h_z| \ge q$ is not among the $k$ coefficients kept by $H_k(\hat{h}(n))$. Then there are $\tau + 1$ kept indices $r_i$ with $h_{r_i} = 0$, each satisfying $|\hat{h}_{r_i}(n)| \ge |\hat{h}_z(n)|$, so $\hat{h}_{r_i}(n)^2 = \hat{h}_z(n)^2 + \epsilon_i^2$. Setting $\epsilon_t^2 = \sum_{i=1}^{\tau+1}\epsilon_i^2$, the squared error $g(n) = \|\hat{h}(n) - h\|_2^2$ satisfies
$g(n) \ge \sum_{i=1}^{\tau+1}\big(\hat{h}_{r_i}(n) - h_{r_i}\big)^2 + \big(\hat{h}_z(n) - h_z\big)^2 \ge (\tau+2)\,\hat{h}_z(n)^2 - 2\hat{h}_z(n)\,h_z + h_z^2 + \epsilon_t^2$.
The function $f(x) = (\tau+2)x^2 - 2h_z x + (h_z^2 + \epsilon_t^2)$ attains its minimum at $x = \frac{h_z}{\tau+2}$ (from $\frac{\partial f}{\partial x} = 0$), where it equals $\epsilon_t^2 + h_z^2\left(1 - \frac{1}{\tau+2}\right)$. Therefore
$g(n) \ge \epsilon_t^2 + h_z^2\left(1 - \frac{1}{\tau+2}\right) \ge q^2\left(1 - \frac{1}{\tau+2}\right)$,
contradicting the assumption $g(n) = \|\hat{h}(n) - h\|_2^2 < q^2\left(1 - \frac{1}{\tau+2}\right)$.

[Figure: SNR (dB) versus iterations for the sparsity-aware NLMS and the standard NLMS, third experiment.]

The same idea extends to a network of $\nu$ nodes. The combination weights are chosen as $a_{i,j} = \frac{a}{\nu - 1}$ for $i \neq j$ and $a_{i,i} = 1 - a$, where the parameter $a$ controls the degree of cooperation. Each node $j$ observes its own input $x_j(n)$, forms the regressor $\mathbf{x}_j(n) = [x_j(n), \ldots, x_j(n-p+1)]^T$, measures the desired signal $d_j(n)$ and the error $e_j(n)$, and computes a local NLMS intermediate estimate
$\psi_j(n) = \hat{h}_j(n) + \frac{e_j(n)\,\mathbf{x}_j(n)}{\mathbf{x}_j^T(n)\,\mathbf{x}_j(n)}$.
The intermediate estimates are then combined, $\phi_j(n) = (1-a)\,\psi_j(n) + a\,q_j(n)$, where $q_j(n)$ averages the intermediate estimates of the other $\nu - 1$ nodes, and finally hard-thresholded,
$\hat{h}_j(n+1) = H_s\big(\phi_j(n)\big)$.

In the following experiment the cooperation parameter is set to $\alpha = 0.4$.

[Figure: SNR (dB) versus iterations for the distributed sparsity-aware NLMS with cooperation parameter $\alpha$ and the standard NLMS.]

Consider now a source signal $x(t)$ received by several sensors. Sensor $j$ observes a delayed version, $x_j(t) = x(t - t_{d_j})$ for $t > t_{d_j}$, where $t_{d_j}$ is the propagation delay to sensor $j$; with noise, $x_j(t) = x(t - t_{d_j}) + w_j(t)$. Comparing two sensors $i$ and $j$,
$x_i(t) = x_j(t - t_{d_{i,j}}) + w_{i,j}(t)$,
where $t_{d_{i,j}} = t_{d_i} - t_{d_j}$ is the relative delay and $w_{i,j}$ the corresponding noise term.

If $d_i$ denotes the distance of sensor $i$ from the source, the path difference is $d_i - d_j = t_{d_{i,j}}\,c$, where $c$ is the propagation speed, and the angle $\theta_{i,j}$ satisfies
$\cos(\theta_{i,j}) = \frac{t_{d_{i,j}}\, c}{d_{i,j}}$,
where $d_{i,j}$ is the distance between sensors $i$ and $j$. Estimating the relative delay $t_{d_{i,j}}$ therefore determines the direction of the source.

The relative delay between sensors $i$ and $j$ can be written as a convolution, $x_j = X_i\, h_{i,j}$, where $X_i$ is a convolution matrix built from $x_i$ and $h_{i,j} \in \mathbb{R}^{N_h}$ is an impulse response that ideally consists of a single spike at the delay $t_{d_{i,j}}$. Keeping only a subset of rows through a selection matrix $Q$, so that $Q x_j = Q X_i\, h_{i,j}$, the sparse delay filter is estimated as
$\hat{h}_{i,j} = \arg\min_{h \in \mathbb{R}^{N_h}} \|Q x_j - Q X_i\, h\|_2^2 + \lambda \|h\|_1$,
and the location of the dominant coefficient of $\hat{h}_{i,j}$ gives the delay estimate.

[Figure: time delay estimation via sparsity; the full estimated filter $\hat{h}$ of length $N_h = 4097$ and a zoomed-in view around the true delay.]

[Figure: zoomed-in view of the estimated delay filter.]
In the frequency domain a pure delay corresponds to a phase shift: at frequency $\omega_0$,
$\mathcal{F}_\omega(x_i)(\omega_0) = e^{-j\omega_0 t_{d_i}}\, \mathcal{F}_\omega(x)(\omega_0)$,
where $\mathcal{F}_\omega$ denotes the Fourier transform; the delays, and hence the angle $\theta$, are encoded in these phase factors.

The delay of sensor $i$ can be written relative to a reference sensor as
$t_{d_i} = t_{d_0} + t_{d_{i,0}} = t_{d_0} + \frac{d_{i,0}\cos(\theta)}{c}$,
so the received spectrum at sensor $i$ is $\mathcal{F}_\omega(x_i)(\omega_0) = \alpha_i(\theta)\,\mathcal{F}_\omega(x)(\omega_0)$, where $\alpha_i(\theta)$ is a phase factor depending on the direction of arrival $\theta$. With $N_s$ sources $s_k$ arriving from angles $\theta_k$,
$\mathcal{F}_\omega(x_i)(\omega_0) = \sum_{k=1}^{N_s} \alpha_i(\theta_k)\, \mathcal{F}_\omega(s_k)(\omega_0)$.
Stacking the source spectra into a vector $\mathbf{s}^{(\omega_0)}$ and the phase factors into $\boldsymbol{\alpha}_i$, this is $\mathcal{F}_\omega(x_i)(\omega_0) = \boldsymbol{\alpha}_i^T \mathbf{s}^{(\omega_0)}$, and collecting all sensors gives the linear model
$\mathbf{x}^{(\omega_0)} = A^{(\omega_0)}\, \mathbf{s}^{(\omega_0)}$.

The model is evaluated on a grid of $N_\omega$ frequencies $\omega_l$, with one system $\mathbf{x}^{(\omega_l)} = A^{(\omega_l)}\,\mathbf{s}^{(\omega_l)}$ per frequency, and on a grid of $N_\theta$ candidate angles $\theta_k$. Stacking the per-frequency relations gives a block system with matrix
$A = \begin{pmatrix} A^{(\omega_0)} \\ A^{(\omega_1)} \\ \vdots \\ A^{(\omega_{N_\omega})} \end{pmatrix}$
and an unknown vector of length $N_\theta N_\omega$ organized into groups $\mathbf{s}^{(k)}$, one per candidate angle $\theta_k$. Only a few angles are active, so the groups are jointly sparse, leading to a group-sparse ($\ell_2/\ell_1$) formulation
$\hat{\mathbf{s}} = \arg\min_{\mathbf{s} \in \mathbb{C}^{N_\omega N_\theta}} \|\mathbf{x} - A\mathbf{s}\|_2^2 + \lambda \sum_{k=1}^{N_\theta} \|\mathbf{s}^{(k)}\|_2$.

[Figure: angle-of-arrival detection via sparsity; normalized power (dB) versus angle $\theta$.]

Finally, consider signals that are sparse in the frequency domain, $x = \Phi\theta$, where $\Phi$ is the inverse Fourier basis and $\theta$ has few non-zero entries. Sub-sampling with a selection matrix $U$ gives
$y = Ux = (U\Phi)\theta$,
so the sparse spectrum $\theta$ can be recovered from far fewer samples than the signal length by any of the previous algorithms.

[Figures: original versus estimated spectrum (amplitude versus frequency in Hz) for three cases.]

In summary, sparse recovery replaces the intractable $\ell_0$ problem with $\ell_2$- or, more effectively, $\ell_1$-based formulations; the $\ell_1$ norm offers the best compromise between promoting sparsity, approximating the $\ell_0$ solution, and remaining computationally tractable.