Homework for 1/27 Due 2/5


Name: ____________    ID: ____________

Homework for 1/27 Due 2/5

1. [8-3] In Example D of Section 8.4, the pdf of the population distribution is

$$ f(x \mid \alpha) = \frac{1 + \alpha x}{2}, \qquad -1 \le x \le 1, \ -1 \le \alpha \le 1, $$

and $f(x \mid \alpha) = 0$ otherwise, and the method of moments estimate was found to be $\hat\alpha = 3\bar X$ (where $\bar X$ is the sample mean of the random sample $X_1, \dots, X_n$). In this problem, you will consider the sampling distribution of $\hat\alpha$.

(a) Show that the estimate $\hat\alpha$ is unbiased.

(b) Find $\operatorname{Var}[\hat\alpha]$. [Hint: What is $\operatorname{Var}[\bar X]$?]

(c) Use the central limit theorem to deduce a normal approximation to the sampling distribution of $\hat\alpha$. According to this approximation, if $n = 25$ and $\alpha = 0$, what is $P(|\hat\alpha| > 0.5)$?

Solution:

(a) We notice that

$$ E[\bar X] = E[X_1] = \int_{-1}^{1} x \cdot \frac{1 + \alpha x}{2}\, dx = \left[ \frac{x^2}{4} + \frac{\alpha x^3}{6} \right]_{-1}^{1} = \frac{\alpha}{3}. $$

Therefore,

$$ E[\hat\alpha] = E[3\bar X] = 3E[\bar X] = 3 \cdot \frac{\alpha}{3} = \alpha, $$

so $\hat\alpha$ is unbiased.

(b) First, we have

$$ E[X_1^2] = \int_{-1}^{1} x^2 \cdot \frac{1 + \alpha x}{2}\, dx = \left[ \frac{x^3}{6} + \frac{\alpha x^4}{8} \right]_{-1}^{1} = \frac{1}{3}, $$

and

$$ \operatorname{Var}[X_1] = E[X_1^2] - E[X_1]^2 = \frac{1}{3} - \left( \frac{\alpha}{3} \right)^2 = \frac{3 - \alpha^2}{9}. $$

Thus,

$$ \operatorname{Var}[\bar X] = \operatorname{Var}\!\left[ \frac{1}{n} \sum_{i=1}^{n} X_i \right] = \frac{1}{n^2} \sum_{i=1}^{n} \operatorname{Var}[X_i] = \frac{\operatorname{Var}[X_1]}{n} = \frac{3 - \alpha^2}{9n}. $$

Therefore, we have

$$ \operatorname{Var}[\hat\alpha] = \operatorname{Var}[3\bar X] = 9\operatorname{Var}[\bar X] = \frac{3 - \alpha^2}{n}. $$

(c) According to the central limit theorem, we have

$$ \bar X \approx N\!\left( \frac{\alpha}{3}, \frac{3 - \alpha^2}{9n} \right), $$

approximately. Therefore, $\hat\alpha = 3\bar X$ implies that

$$ \hat\alpha \approx N\!\left( \alpha, \frac{3 - \alpha^2}{n} \right), $$

approximately. In the case $\alpha = 0$ and $n = 25$, we have $\hat\alpha \approx N(0, 0.12)$, approximately. Thus

$$ P(|\hat\alpha| > 0.5) = P(\hat\alpha > 0.5) + P(\hat\alpha < -0.5) = P\!\left( \frac{\hat\alpha}{\sqrt{0.12}} > \frac{0.5}{\sqrt{0.12}} \right) + P\!\left( \frac{\hat\alpha}{\sqrt{0.12}} < -\frac{0.5}{\sqrt{0.12}} \right) \approx P(Z > 1.44) + P(Z < -1.44) = 0.0749 + 0.0749 = 0.1498. $$
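
The normal approximation in part (c) is easy to check numerically. The following Monte Carlo sketch is not part of the original solution; it assumes the illustrative values $n = 25$, $\alpha = 0$ from the problem, under which the population density $(1 + \alpha x)/2$ on $[-1, 1]$ reduces to Uniform$(-1, 1)$, and compares the empirical tail probability $P(|\hat\alpha| > 0.5)$ with the approximate value 0.1498 found above.

    import numpy as np

    rng = np.random.default_rng(0)
    n, alpha, reps = 25, 0.0, 200_000

    # With alpha = 0, the density (1 + alpha*x)/2 on [-1, 1] is just Uniform(-1, 1).
    samples = rng.uniform(-1.0, 1.0, size=(reps, n))
    alpha_hat = 3.0 * samples.mean(axis=1)   # method of moments estimate, one per replicate

    print("empirical mean of alpha_hat:", alpha_hat.mean())        # should be near alpha = 0
    print("empirical var of alpha_hat: ", alpha_hat.var())         # compare with (3 - alpha^2)/n
    print("theoretical (3 - alpha^2)/n:", (3 - alpha**2) / n)
    print("empirical P(|alpha_hat| > 0.5):", np.mean(np.abs(alpha_hat) > 0.5))   # compare with 0.1498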

2. [8-53] Let $X_1, \dots, X_n$ be i.i.d. uniform on $[0, \theta]$.

(a) Find the method of moments estimate of $\theta$, and the mean, variance, bias, and MSE of the MME.

(b) The mle of $\theta$ is $\hat\theta = \max_i X_i$. The pdf of $\max_i X_i$ (How do we find this?) is

$$ f(x \mid \theta) = \frac{n x^{n-1}}{\theta^n}, \qquad 0 < x < \theta, $$

and $0$ otherwise. Calculate the mean and variance of the mle. Compare the variance, the bias, and the mean squared error to those of the method of moments estimate.

(c) Find a modification of the mle that renders it unbiased.

Solution:

(a) Since we have

$$ \mu_1 = E[X_1] = \int_0^\theta x \cdot \frac{1}{\theta}\, dx = \frac{\theta^2}{2\theta} = \frac{\theta}{2}, $$

we get $\theta = 2\mu_1$, and the MME for $\theta$ is $\tilde\theta = 2\bar X$, where $\bar X = \frac{1}{n}\sum_{i=1}^{n} X_i$. Furthermore, we have

$$ \mu_2 = E[X_1^2] = \int_0^\theta x^2 \cdot \frac{1}{\theta}\, dx = \frac{\theta^3}{3\theta} = \frac{\theta^2}{3} $$

and

$$ \operatorname{Var}[X_1] = \mu_2 - \mu_1^2 = \frac{\theta^2}{3} - \left( \frac{\theta}{2} \right)^2 = \frac{\theta^2}{12}. $$

Thus

$$ E[\bar X] = E[X_1] = \frac{\theta}{2} \qquad \text{and} \qquad \operatorname{Var}[\bar X] = \frac{\operatorname{Var}[X_1]}{n} = \frac{\theta^2}{12n}. $$

It follows that

$$ E[\tilde\theta] = E[2\bar X] = 2E[\bar X] = \theta \qquad \text{and} \qquad \operatorname{Var}[\tilde\theta] = \operatorname{Var}[2\bar X] = 4\operatorname{Var}[\bar X] = \frac{\theta^2}{3n}. $$

In particular, the MME $\tilde\theta$ is unbiased. The bias and MSE of $\tilde\theta$ are

$$ b(\tilde\theta) = 0 \qquad \text{and} \qquad \operatorname{MSE}(\tilde\theta) = \operatorname{Var}[\tilde\theta] + b(\tilde\theta)^2 = \operatorname{Var}[\tilde\theta] = \frac{\theta^2}{3n}. $$

(b) The mean of $\hat\theta$ is

$$ E[\hat\theta] = \int_0^\theta x \cdot \frac{n x^{n-1}}{\theta^n}\, dx = \frac{n}{(n+1)\theta^n}\,\theta^{n+1} = \frac{n}{n+1}\,\theta. $$

Similarly,

$$ E[\hat\theta^2] = \int_0^\theta x^2 \cdot \frac{n x^{n-1}}{\theta^n}\, dx = \frac{n}{(n+2)\theta^n}\,\theta^{n+2} = \frac{n}{n+2}\,\theta^2. $$

Hence the variance of $\hat\theta$ is

$$ \operatorname{Var}[\hat\theta] = E[\hat\theta^2] - E[\hat\theta]^2 = \frac{n}{n+2}\,\theta^2 - \left( \frac{n}{n+1}\,\theta \right)^2 = \frac{n\,\theta^2}{(n+2)(n+1)^2}. $$

The bias of $\hat\theta$ is

$$ b(\hat\theta) = E[\hat\theta] - \theta = -\frac{\theta}{n+1}, $$

and the MSE of $\hat\theta$ is

$$ \operatorname{MSE}(\hat\theta) = \operatorname{Var}[\hat\theta] + b(\hat\theta)^2 = \frac{n\,\theta^2}{(n+2)(n+1)^2} + \frac{\theta^2}{(n+1)^2} = \frac{2\theta^2}{(n+1)(n+2)}. $$

By comparison, although the MLE $\hat\theta$ is biased while the MME $\tilde\theta$ is unbiased, we see that $\operatorname{MSE}(\hat\theta) < \operatorname{MSE}(\tilde\theta)$ when $n$ is large. In fact, $\operatorname{MSE}(\hat\theta)$ decreases much faster than $\operatorname{MSE}(\tilde\theta)$ (of order $1/n^2$ rather than $1/n$).

(c) Let

$$ \theta^* = \frac{n+1}{n}\,\hat\theta = \frac{n+1}{n}\,\max_i X_i; $$

it follows that

$$ E[\theta^*] = E\!\left[ \frac{n+1}{n}\,\hat\theta \right] = \frac{n+1}{n}\, E[\hat\theta] = \frac{n+1}{n} \cdot \frac{n}{n+1}\,\theta = \theta. $$

Thus $\theta^*$ is unbiased.
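
The bias, variance, and MSE comparison in parts (a) and (b) can also be confirmed by simulation. The sketch below is not part of the original solution; $\theta = 1$ and $n = 20$ are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    theta, n, reps = 1.0, 20, 200_000

    x = rng.uniform(0.0, theta, size=(reps, n))
    mme = 2.0 * x.mean(axis=1)     # method of moments estimate
    mle = x.max(axis=1)            # maximum likelihood estimate

    for name, est in [("MME 2*Xbar", mme), ("MLE max Xi", mle)]:
        bias = est.mean() - theta
        var = est.var()
        mse = np.mean((est - theta) ** 2)
        print(f"{name}: bias={bias:.4f}  var={var:.4f}  mse={mse:.4f}")

    # Theoretical values for comparison:
    print("MME: var = mse =", theta**2 / (3 * n))
    print("MLE: bias =", -theta / (n + 1), " mse =", 2 * theta**2 / ((n + 1) * (n + 2)))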

3. [8-57] This problem is concerned with the estimation of the variance of a normal distribution with unknown mean from a sample $X_1, \dots, X_n$ of i.i.d. normal random variables $N(\mu, \sigma^2)$. In answering the following questions, use the fact that (from Theorem B of Section 6.3)

$$ \frac{(n-1)s^2}{\sigma^2} \sim \chi^2_{n-1} $$

and that the mean and variance of a chi-square random variable with $r$ df are $r$ and $2r$, respectively.

(a) Which of the following estimates is unbiased?

$$ s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar X)^2 \qquad \text{and} \qquad \hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar X)^2 $$

(We discussed this in class. However, there we did not assume normality. When the distribution is not normal, the argument is much more complicated, as seen in class. A technical detail is provided at the end of this homework.)

(b) Which of the estimates given in part (a) has the smaller MSE?

(c) For what value of $\rho$ does $\rho \sum_{i=1}^{n}(X_i - \bar X)^2$ have the minimal MSE (as an estimate for $\sigma^2$)?

Solution:

(a) Since

$$ \frac{(n-1)s^2}{\sigma^2} \sim \chi^2_{n-1}, $$

and $E[U] = r$ and $\operatorname{Var}[U] = 2r$ where $U \sim \chi^2_r$, we have

$$ E\!\left[ \frac{(n-1)s^2}{\sigma^2} \right] = n-1 \qquad \text{and} \qquad \operatorname{Var}\!\left[ \frac{(n-1)s^2}{\sigma^2} \right] = 2(n-1). $$

Therefore,

$$ E[s^2] = \frac{\sigma^2}{n-1}\,(n-1) = \sigma^2 \qquad \text{and} \qquad \operatorname{Var}[s^2] = \left( \frac{\sigma^2}{n-1} \right)^2 2(n-1) = \frac{2\sigma^4}{n-1}. $$

Furthermore, since $\hat\sigma^2 = \frac{n-1}{n}\, s^2$, we have

$$ E[\hat\sigma^2] = \frac{n-1}{n}\, E[s^2] = \frac{n-1}{n}\,\sigma^2 $$

and

$$ \operatorname{Var}[\hat\sigma^2] = \left( \frac{n-1}{n} \right)^2 \operatorname{Var}[s^2] = \frac{2(n-1)}{n^2}\,\sigma^4. $$

Thus $s^2$ is an unbiased estimate for $\sigma^2$ while $\hat\sigma^2$ is not.

(b) The biases of the two estimates are

$$ b(s^2) = 0 \qquad \text{and} \qquad b(\hat\sigma^2) = E[\hat\sigma^2] - \sigma^2 = \frac{n-1}{n}\,\sigma^2 - \sigma^2 = -\frac{\sigma^2}{n}, $$

respectively. Thus the MSEs of the two estimates are

$$ \operatorname{MSE}(s^2) = \operatorname{Var}[s^2] + b(s^2)^2 = \frac{2\sigma^4}{n-1} $$

and

$$ \operatorname{MSE}(\hat\sigma^2) = \operatorname{Var}[\hat\sigma^2] + b(\hat\sigma^2)^2 = \frac{2(n-1)}{n^2}\,\sigma^4 + \frac{\sigma^4}{n^2} = \frac{2n-1}{n^2}\,\sigma^4, $$

respectively. Since

$$ \frac{2n-1}{n^2} < \frac{2}{n-1} \iff (2n-1)(n-1) < 2n^2 \iff 2n^2 - 3n + 1 < 2n^2 \iff 1 < 3n, $$

which holds for every $n \ge 2$, we have $\operatorname{MSE}(\hat\sigma^2) < \operatorname{MSE}(s^2)$.

(c) Let $Y := \rho \sum_{i=1}^{n}(X_i - \bar X)^2$. Then $Y = \rho(n-1)s^2$. By a similar argument as in (a), we have

$$ E[Y] = \rho(n-1)E[s^2] = \rho(n-1)\sigma^2 \qquad \text{and} \qquad \operatorname{Var}[Y] = \rho^2(n-1)^2 \operatorname{Var}[s^2] = \rho^2(n-1)^2\,\frac{2\sigma^4}{n-1} = 2\rho^2(n-1)\sigma^4. $$

Thus

$$ \operatorname{MSE}(Y) = \operatorname{Var}[Y] + b(Y)^2 = 2\rho^2(n-1)\sigma^4 + \left( \rho(n-1)\sigma^2 - \sigma^2 \right)^2 = \sigma^4\left[ 2\rho^2(n-1) + (\rho(n-1) - 1)^2 \right] =: f(\rho). $$

Since

$$ f'(\rho) = \sigma^4\left[ 4\rho(n-1) + 2(\rho(n-1) - 1)(n-1) \right] = \sigma^4\left[ 4\rho + 2\rho(n-1) - 2 \right](n-1) = 2(n-1)\sigma^4\left( (n+1)\rho - 1 \right) $$

and $f''(\rho) = 2(n-1)(n+1)\sigma^4 > 0$, we see that $f(\rho)$ achieves its minimum at $\rho = \frac{1}{n+1}$. (Indeed $f'\!\left( \frac{1}{n+1} \right) = 0$.)
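
A quick simulation (not part of the original solution) makes the ordering of the three MSEs in parts (b) and (c) concrete; $n = 10$, $\mu = 0$, $\sigma = 1$ are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, mu, sigma, reps = 10, 0.0, 1.0, 400_000

    x = rng.normal(mu, sigma, size=(reps, n))
    ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)   # sum of squared deviations

    for label, rho in [("1/(n-1)", 1.0 / (n - 1)), ("1/n", 1.0 / n), ("1/(n+1)", 1.0 / (n + 1))]:
        mse = np.mean((rho * ss - sigma**2) ** 2)
        print(f"rho = {label}: empirical MSE = {mse:.5f}")

    # Theoretical MSEs (times sigma**4): 2/(n-1), (2n-1)/n^2, and 2/(n+1).
    print("theory:", 2 / (n - 1), (2 * n - 1) / n**2, 2 / (n + 1))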

Name: ____________    ID: ____________

Homework for 1/29 Due 2/5

1. [8-7] Suppose that $X$ follows a geometric distribution,

$$ P(X = k) = p(1-p)^{k-1}, \qquad k = 1, 2, \dots, $$

and assume $X_1, \dots, X_n$ is an i.i.d. sample of size $n$. Find the asymptotic variance of the mle. (The moments of the geometric distribution can be found in P7.)

Solution: We have

$$ \log f(X \mid p) = \log p + (X - 1)\log(1-p), $$

$$ \frac{\partial}{\partial p}\log f(X \mid p) = \frac{1}{p} - \frac{X-1}{1-p}, $$

and

$$ \frac{\partial^2}{\partial p^2}\log f(X \mid p) = -\frac{1}{p^2} - \frac{X-1}{(1-p)^2}. $$

Therefore, the Fisher information is

$$ I(p) = -E\!\left[ \frac{\partial^2}{\partial p^2}\log f(X \mid p) \right] = \frac{1}{p^2} + \frac{E[X] - 1}{(1-p)^2} = \frac{1}{p^2} + \frac{1/p - 1}{(1-p)^2} = \frac{1}{p^2} + \frac{1}{p(1-p)} = \frac{1}{(1-p)p^2}, $$

and the asymptotic variance of the mle is

$$ \frac{1}{nI(p)} = \frac{(1-p)p^2}{n}. $$
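
This answer can be checked by simulation. The sketch below is not part of the original solution; it uses the standard fact that the mle here is $\hat p = 1/\bar X$ (equivalently $n / \sum_i X_i$), and the values $p = 0.3$, $n = 200$ are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    p, n, reps = 0.3, 200, 50_000

    # numpy's geometric() counts the number of trials up to and including the first
    # success, i.e. support k = 1, 2, ..., matching P(X = k) = p (1 - p)**(k - 1).
    x = rng.geometric(p, size=(reps, n))
    p_hat = 1.0 / x.mean(axis=1)    # mle of p

    print("n * empirical Var[p_hat]:", n * p_hat.var())
    print("theoretical p^2 (1 - p): ", p**2 * (1 - p))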

We have log f(x σ) = log log σ X σ, σ log f(x σ) = σ + X σ, E[ X ] = = σ = σ. x σ exp x σ exp ( x σ Therefore, the Fisher iformatio is [ ] I(σ) = E log f(x σ) σ ad the asymptotic variace of the mle is σ log f(x σ) = σ X σ 3, ad ( x ) x dx = ( σ σ exp x ) dx σ ) d x σ = σ ye y dy ( = σ ) σ 3 E[ X ] = σ, I(σ) = σ. 8

3. [8-47] The Pareto distribution has been used in economics as a model for a density function with a slowly decaying tail:

$$ f(x \mid x_0, \theta) = \theta x_0^{\theta} x^{-\theta-1}, \qquad x \ge x_0, \ \theta > 1. $$

Assume that $x_0 > 0$ is given and that $X_1, X_2, \dots, X_n$ is an i.i.d. sample.

(a) Find the method of moments estimate of $\theta$.

(b) Find the mle of $\theta$.

(c) Find the asymptotic variance of the mle.

Solution:

(a) Since we have

$$ \mu_1 = E[X_1] = \int_{x_0}^{\infty} x\,\theta x_0^{\theta} x^{-\theta-1}\, dx = \theta x_0^{\theta} \int_{x_0}^{\infty} x^{-\theta}\, dx = \theta x_0^{\theta}\,\frac{x_0^{1-\theta}}{\theta - 1} = \frac{\theta x_0}{\theta - 1}, $$

solving for $\theta$ gives

$$ \theta = \frac{\mu_1}{\mu_1 - x_0}, $$

and the MME is

$$ \tilde\theta = \frac{\bar X}{\bar X - x_0}, $$

where $\bar X = \frac{1}{n}\sum_{i=1}^{n} X_i$.

(b) The log likelihood function is

$$ l(\theta) = \sum_{i=1}^{n}\left( \log\theta + \theta\log x_0 - (\theta+1)\log X_i \right) = n\log\theta + n\theta\log x_0 - (\theta+1)\sum_{i=1}^{n}\log X_i. $$

Thus

$$ l'(\theta) = \frac{n}{\theta} + n\log x_0 - \sum_{i=1}^{n}\log X_i \qquad \text{and} \qquad l''(\theta) = -\frac{n}{\theta^2} < 0. $$

Since $l'(\hat\theta) = 0$,

the mle of $\theta$ is

$$ \hat\theta = \frac{n}{\sum_{i=1}^{n}\log X_i - n\log x_0}. $$

(c) We have

$$ \log f(X \mid \theta) = \log\theta + \theta\log x_0 - (\theta+1)\log X, $$

$$ \frac{\partial}{\partial\theta}\log f(X \mid \theta) = \frac{1}{\theta} + \log x_0 - \log X, \qquad \frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta) = -\frac{1}{\theta^2}. $$

Therefore, the Fisher information is

$$ I(\theta) = -E\!\left[ \frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta) \right] = -\left( -\frac{1}{\theta^2} \right) = \frac{1}{\theta^2}, $$

and the asymptotic variance of the mle is

$$ \frac{1}{nI(\theta)} = \frac{\theta^2}{n}. $$
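
Finally, a Monte Carlo sketch (not part of the original solution) for part (c): Pareto samples are drawn by inverse-CDF sampling, the mle $\hat\theta$ from part (b) is computed for each sample, and $n\operatorname{Var}[\hat\theta]$ is compared with $\theta^2$. The values $x_0 = 1$, $\theta = 3$, $n = 200$ are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    x0, theta, n, reps = 1.0, 3.0, 200, 20_000

    # Inverse-CDF sampling: if U ~ Uniform(0, 1], then x0 * U**(-1/theta) has density
    # theta * x0**theta * x**(-theta - 1) on [x0, infinity).
    u = 1.0 - rng.uniform(size=(reps, n))   # shift to (0, 1] to avoid u = 0
    x = x0 * u ** (-1.0 / theta)

    theta_hat = n / (np.log(x).sum(axis=1) - n * np.log(x0))   # mle from part (b)

    print("n * empirical Var[theta_hat]:", n * theta_hat.var())
    print("theoretical theta^2:         ", theta**2)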