Statistical Inference I

Locally most powerful tests

Shirsendu Mukherjee
Department of Statistics, Asutosh College, Kolkata, India.
shirsendu st@yahoo.co.in

So far we have treated the testing of one-sided and two-sided problems for a real parameter when the distribution of the random observable $X$ is sufficiently well behaved, that is, a one-parameter exponential family or a family with monotone likelihood ratio. In this module we consider the problem of finding optimal tests for families that do not have a monotone likelihood ratio and do not belong to a one-parameter exponential family. Among the various methods available in the literature, we discuss the locally best tests of Neyman and Pearson.

One sided tests

Let $X$ be a random variable with p.m.f./p.d.f. $p_\theta(x)$, $\theta \in \Theta \subseteq \mathbb{R}$. Consider the problem of testing
\[ H : \theta = \theta_0 \quad \text{vs.} \quad H_A : \theta > \theta_0. \]

Definition. A test $\phi_0$ is called a locally most powerful (LMP) test at size $\alpha$ for testing $H : \theta = \theta_0$ vs. $H_A : \theta > \theta_0$ if for some $\epsilon > 0$,

(i) $\beta_{\phi_0}(\theta_0) = \alpha$,
(ii) $\beta_{\phi_0}(\theta) \ge \beta_{\phi}(\theta)$ for all $\theta \in (\theta_0, \theta_0 + \epsilon)$ and for every $\phi$ satisfying (i),

where $\beta_\phi(\theta) = E_\theta \phi$ denotes the power function of the test $\phi$.

By the mean value theorem,
\[ \beta_\phi(\theta) = \beta_\phi(\theta_0) + (\theta - \theta_0)\,\beta'_\phi(\theta^*), \qquad \theta_0 < \theta^* < \theta, \]
so for $\theta$ close to $\theta_0$ the power is governed by the slope of the power function near $\theta_0$; this suggests choosing $\phi$ to maximize $\beta'_\phi(\theta_0)$ subject to the size constraint.
Assumptions

(i) $\beta_\phi(\theta)$ is continuously differentiable in a neighborhood of $\theta_0$ for every $\phi$.
(ii) $\beta'_\phi(\theta) = \int \phi\, \dfrac{\partial p_\theta(x)}{\partial \theta}\, dx$ (differentiation may be carried out under the integral sign).

Our problem is to maximize $\beta'_\phi(\theta_0)$ subject to
\[ \beta_\phi(\theta_0) = \int \phi\, p_{\theta_0}(x)\, dx = \alpha. \]
Using the generalized Neyman–Pearson lemma we get the optimum choice of $\phi$ as
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \dfrac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0} > k\, p_{\theta_0}(x) \\[4pt] \gamma(x) & \text{if } \dfrac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0} = k\, p_{\theta_0}(x) \\[4pt] 0 & \text{if } \dfrac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0} < k\, p_{\theta_0}(x), \end{cases} \]
or, equivalently,
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} > k \\[4pt] \gamma(x) & \text{if } \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} = k \\[4pt] 0 & \text{if } \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} < k, \end{cases} \]
where $k$ and $\gamma$ are such that $E_{\theta_0} \phi_0 = \alpha$.
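The statistic $\frac{\partial}{\partial \theta} \log p_\theta(x)\big|_{\theta=\theta_0}$ is the score function evaluated at $\theta_0$, so the LMP test is a one-sided score test. As a minimal sketch added for illustration (not part of the original notes), the Python code below evaluates the score by a central finite difference for any user-supplied log-density; the names `lmp_statistic`, `lmp_test`, and `cauchy_loglik` are illustrative choices, not a standard API.

```python
import numpy as np

def lmp_statistic(logpdf, x, theta0, h=1e-6):
    """Score d/dtheta log p_theta(x) at theta0, via a central difference."""
    return (logpdf(x, theta0 + h) - logpdf(x, theta0 - h)) / (2.0 * h)

def lmp_test(logpdf, x, theta0, k):
    """One-sided LMP test: reject H when the score at theta0 exceeds k.
    For a continuous statistic the boundary event has probability zero,
    so the randomizing component gamma(x) is omitted."""
    return lmp_statistic(logpdf, x, theta0) > k

# Example log-likelihood: a Cauchy(theta, 1) sample, as in the example below.
def cauchy_loglik(x, theta):
    return -len(x) * np.log(np.pi) - np.sum(np.log1p((x - theta) ** 2))
```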
Example. Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Cauchy}(\theta, 1)$. The joint density is given by
\[ p_\theta(x) = \frac{1}{\pi^n} \prod_{i=1}^{n} \frac{1}{1 + (x_i - \theta)^2}. \]
Consider the problem of testing
\[ H_0 : \theta = \theta_0 \quad \text{vs.} \quad H_1 : \theta > \theta_0. \]
The LMP test is given by
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} > k \\[4pt] 0 & \text{if } \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} < k. \end{cases} \]
Therefore,
\[ \frac{\partial}{\partial \theta} \log p_\theta(x) = \frac{\partial}{\partial \theta} \left( \text{const.} - \sum_{i=1}^{n} \log\left[ 1 + (x_i - \theta)^2 \right] \right) = \sum_{i=1}^{n} \frac{2(x_i - \theta)}{1 + (x_i - \theta)^2}. \]
For $\theta_0 = 0$, i.e., $H_0 : X \sim C(0, 1)$, the test becomes
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \sum_{i=1}^{n} \dfrac{2x_i}{1 + x_i^2} > c \\[4pt] 0 & \text{if } \sum_{i=1}^{n} \dfrac{2x_i}{1 + x_i^2} < c. \end{cases} \]
Since $X$ has a distribution symmetric about $0$,
\[ E\left( \frac{2X}{1 + X^2} \right) = 0, \]
and a direct computation gives
\[ V\left( \frac{2X}{1 + X^2} \right) = \frac{1}{2}. \]
So, by the CLT,
\[ \frac{\sum_{i=1}^{n} \frac{2X_i}{1 + X_i^2}}{\sqrt{n/2}} \overset{a}{\sim} N(0, 1). \]
Hence, for large enough $n$, we can choose $c = \tau_\alpha \sqrt{n/2}$, where $\tau_\alpha$ is the upper $\alpha$-point of $N(0,1)$.

Drawback. If $\alpha < 1/2$, then $c > 0$ and $\left\{ x : \sum_{i=1}^{n} \frac{2x_i}{1 + x_i^2} > c \right\}$ is a bounded set, i.e.,
\[ \left\{ x : \sum_{i=1}^{n} \frac{2x_i}{1 + x_i^2} > c \right\} \subseteq \left\{ x : \sum_{i=1}^{n} x_i^2 < R^2 \right\}, \qquad 0 < R^2 < \infty. \]
Therefore the power function satisfies
\[ P_\theta\left[ \sum_{i=1}^{n} \frac{2X_i}{1 + X_i^2} > c \right] \le \int_{\sum x_i^2 < R^2} \frac{1}{\pi^n} \prod_{i=1}^{n} \frac{dx_i}{1 + (x_i - \theta)^2} \le \frac{1}{\pi} \int_{|x_1| < R} \frac{dx_1}{1 + (x_1 - \theta)^2} = \frac{1}{\pi} \int_{-R-\theta}^{R-\theta} \frac{dz}{1 + z^2} \longrightarrow 0 \quad \text{as } \theta \to \infty, \]
the second inequality following by integrating out $x_2, \ldots, x_n$, and the last step by the substitution $z = x_1 - \theta$. Thus the power goes to $0$ as $\theta \to \infty$: although $\phi_0$ is good at detecting small departures from the null hypothesis, it is unsuccessful in detecting sufficiently large ones.
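This behaviour is easy to see numerically. The following Monte Carlo sketch (an illustration added here, not part of the original notes) implements the LMP test with the CLT cutoff $c = \tau_\alpha\sqrt{n/2}$ and estimates its power at several values of $\theta$; the sample size, replication count, and function names are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def score_stat(x):
    """T(x) = sum_i 2 x_i / (1 + x_i^2), the LMP statistic for H0: theta = 0."""
    return np.sum(2.0 * x / (1.0 + x**2), axis=-1)

def lmp_power(theta, n=50, alpha=0.05, reps=20000):
    """Monte Carlo estimate of the power at theta, using c = tau_alpha * sqrt(n/2)."""
    c = norm.ppf(1.0 - alpha) * np.sqrt(n / 2.0)
    x = theta + rng.standard_cauchy((reps, n))   # Cauchy(theta, 1) samples
    return np.mean(score_stat(x) > c)

for theta in [0.0, 0.2, 0.5, 1.0, 5.0, 50.0]:
    print(f"theta = {theta:5.1f}  power ~ {lmp_power(theta):.3f}")
```

For $\theta = 0$ the estimate is close to $\alpha$; it rises for moderate $\theta$ and collapses toward $0$ for large $\theta$, as the argument above predicts.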
Two sided tests

The notion of a locally best test may be extended to testing the hypothesis $H : \theta = \theta_0$ vs. $H_A : \theta \ne \theta_0$ by specifying the slope of the power function at $\theta_0$ and requiring the second derivative of the power function at $\theta_0$ to be a maximum. Hence we consider only unbiased tests, whose power function has slope zero at $\theta_0$.

Assumptions

(i) $\beta_\phi(\theta)$ is twice continuously differentiable in a neighbourhood of $\theta_0$ for every $\phi$.
(ii) $\beta'_\phi(\theta) = \int \phi\, \dfrac{\partial p_\theta(x)}{\partial \theta}\, dx$.
(iii) $\beta''_\phi(\theta) = \int \phi\, \dfrac{\partial^2 p_\theta(x)}{\partial \theta^2}\, dx$.

By the mean value theorem,
\[ \beta_\phi(\theta) = \beta_\phi(\theta_0) + (\theta - \theta_0)\,\beta'_\phi(\theta_0) + \frac{(\theta - \theta_0)^2}{2!}\,\beta''_\phi(\theta^*), \qquad \min(\theta_0, \theta) < \theta^* < \max(\theta_0, \theta). \]

Locally best unbiased tests

A test $\phi_0$ is called a locally best unbiased test at size $\alpha$ for testing $H_0 : \theta = \theta_0$ vs. $H_1 : \theta \ne \theta_0$ if, out of all $\phi$ satisfying $\beta_\phi(\theta_0) = \alpha$ and $\beta'_\phi(\theta_0) = 0$, the test $\phi_0$ maximizes the value of the second derivative at $\theta_0$, that is,
\[ \beta''_{\phi_0}(\theta_0) \ge \beta''_\phi(\theta_0). \]
When such a test is unique, the optimum property of a locally best unbiased test may be stated as follows: a test $\phi_0$ is a locally most powerful unbiased (LMPU) test at size $\alpha$ for testing $H : \theta = \theta_0$ vs. $H_A : \theta \ne \theta_0$ if for some $\epsilon > 0$,

(i) $\beta_{\phi_0}(\theta_0) = \alpha$,
(ii) $\beta_{\phi_0}(\theta) \ge \beta_\phi(\theta)$ for all $0 < |\theta - \theta_0| < \epsilon$,

where $\beta_\phi(\theta)$ is the power function of the test $\phi$ and $\phi$ satisfies (i).

Our aim is to maximize
\[ \int \phi\, \frac{\partial^2 p_\theta(x)}{\partial \theta^2}\Big|_{\theta=\theta_0}\, dx \]
subject to
\[ \int \phi\, p_{\theta_0}(x)\, dx = \alpha \quad \text{and} \quad \int \phi\, \frac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0}\, dx = 0. \]
By the generalized Neyman–Pearson lemma, the optimum test is
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \dfrac{\partial^2 p_\theta(x)}{\partial \theta^2}\Big|_{\theta=\theta_0} > k_1\, p_{\theta_0}(x) + k_2\, \dfrac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0} \\[4pt] 0 & \text{if } \dfrac{\partial^2 p_\theta(x)}{\partial \theta^2}\Big|_{\theta=\theta_0} < k_1\, p_{\theta_0}(x) + k_2\, \dfrac{\partial p_\theta(x)}{\partial \theta}\Big|_{\theta=\theta_0}, \end{cases} \]
where $k_1$ and $k_2$ are such that $E_{\theta_0}\phi_0 = \alpha$ and $\beta'_{\phi_0}(\theta_0) = 0$.

Note that
\[ \frac{\partial}{\partial \theta} \log p_\theta(x) = \frac{\partial p_\theta(x) / \partial \theta}{p_\theta(x)} \]
and
\[ \frac{\partial^2}{\partial \theta^2} \log p_\theta(x) = \frac{\partial^2 p_\theta(x) / \partial \theta^2}{p_\theta(x)} - \left( \frac{\partial \log p_\theta(x)}{\partial \theta} \right)^2. \]
Therefore, dividing through by $p_{\theta_0}(x)$, $\phi_0(x)$ can be rewritten as
\[ \phi_0(x) = \begin{cases} 1 & \text{if } \dfrac{\partial^2}{\partial \theta^2} \log p_\theta(x)\Big|_{\theta=\theta_0} + \left( \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} \right)^2 > k_1 + k_2\, \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} \\[4pt] 0 & \text{if } \dfrac{\partial^2}{\partial \theta^2} \log p_\theta(x)\Big|_{\theta=\theta_0} + \left( \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0} \right)^2 < k_1 + k_2\, \dfrac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=\theta_0}. \end{cases} \]

Example. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} C(\theta, 1)$, and consider testing
\[ H_0 : \theta = 0 \quad \text{vs.} \quad H_1 : \theta \ne 0. \]
Since $X_i \sim C(\theta, 1)$, the joint density is given by
\[ p_\theta(x) = \frac{1}{\pi^n} \prod_{i=1}^{n} \frac{1}{1 + (x_i - \theta)^2}, \]
so that
\[ \frac{\partial}{\partial \theta} \log p_\theta(x)\Big|_{\theta=0} = \sum_{i=1}^{n} \frac{2(x_i - \theta)}{1 + (x_i - \theta)^2}\Big|_{\theta=0} = \sum_{i=1}^{n} \frac{2x_i}{1 + x_i^2} \]
and
\[ \frac{\partial^2}{\partial \theta^2} \log p_\theta(x)\Big|_{\theta=0} = \sum_{i=1}^{n} \frac{-2\left[1 + (x_i - \theta)^2\right] + 4(x_i - \theta)^2}{\left[1 + (x_i - \theta)^2\right]^2}\Big|_{\theta=0} = \sum_{i=1}^{n} \frac{2(x_i^2 - 1)}{(1 + x_i^2)^2}. \]
Therefore the LMPU test is given by
\[ \phi_0(x) = \begin{cases} 1 & \text{if } v > k_1 + k_2 u \\ 0 & \text{if } v < k_1 + k_2 u, \end{cases} \]
where
\[ u = \sum_{i=1}^{n} \frac{2x_i}{1 + x_i^2}, \qquad v = \sum_{i=1}^{n} \frac{2(x_i^2 - 1)}{(1 + x_i^2)^2} + u^2. \]
Also, $k_1$ and $k_2$ are such that $E_{\theta=0}\,\phi_0 = \alpha$ and $\beta'_{\phi_0}(0) = 0$.

Observe that $E_\theta \phi = \int \phi\, p_\theta(x)\, dx$, so
\[ \beta'_\phi(\theta)\Big|_{\theta=0} = \int \phi\, \frac{\partial \log p_\theta(x)}{\partial \theta}\Big|_{\theta=0}\, p_0(x)\, dx = E_0[\phi\, U], \]
and hence $\beta'_{\phi_0}(0) = 0 \iff E_0[\phi_0 U] = 0$. Now, writing $U$ and $V$ for the statistics $u$ and $v$,
\[ E_0[\phi_0 U] = E_0\left[ U\, I(V > k_1 + k_2 U) \right] = E_0\left[ U\, I(U > 0,\, V > k_1 + k_2 U) \right] + E_0\left[ U\, I(U < 0,\, V > k_1 + k_2 U) \right]. \]
Under $\theta = 0$ the sample is symmetric about $0$, and $U$ is an odd while $V$ is an even function of $x$, so $(U, V)$ and $(-U, V)$ have the same distribution. Hence
\[ E_0\left[ U\, I(U < 0,\, V > k_1 + k_2 U) \right] = -E_0\left[ U\, I(U > 0,\, V > k_1 - k_2 U) \right], \]
and therefore, taking $k_2 \ge 0$ (the case $k_2 < 0$ is symmetric),
\[ E_0[\phi_0 U] = E_0\left[ U\, I(U > 0,\, V > k_1 + k_2 U) \right] - E_0\left[ U\, I(U > 0,\, V > k_1 - k_2 U) \right] = -\,E_0\left[ U\, I(U > 0,\; k_1 - k_2 U < V \le k_1 + k_2 U) \right]. \]
On the event $\{U > 0,\; k_1 - k_2 U < V \le k_1 + k_2 U\}$ the integrand is strictly positive, so the expectation can vanish only if the event is null, i.e., only if the interval collapses. Therefore, $E_0[\phi_0 U] = 0 \implies k_2 = 0$. Hence the above test reduces to
\[ \phi_0(x) = \begin{cases} 1 & \text{if } v > k \\ 0 & \text{if } v < k, \end{cases} \]
where $k$ is such that $E_{\theta_0} \phi_0 = \alpha$.
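Since the null distribution of $V$ has no convenient closed form, the cutoff $k$ can be approximated by simulating $V$ under $\theta = 0$. The sketch below is an added illustration, not part of the original notes; the sample size, replication count, and function names are arbitrary choices. It calibrates $k$ as the $(1-\alpha)$-quantile of the simulated null statistics and applies the test to one data set.

```python
import numpy as np

rng = np.random.default_rng(1)

def u_stat(x):
    """U = sum_i 2 x_i / (1 + x_i^2)."""
    return np.sum(2.0 * x / (1.0 + x**2), axis=-1)

def v_stat(x):
    """V = sum_i 2 (x_i^2 - 1) / (1 + x_i^2)^2 + U^2, the LMPU statistic at theta_0 = 0."""
    return np.sum(2.0 * (x**2 - 1.0) / (1.0 + x**2) ** 2, axis=-1) + u_stat(x) ** 2

def calibrate_k(n, alpha=0.05, reps=100_000):
    """Approximate k by the (1 - alpha)-quantile of V under H0: theta = 0."""
    v0 = v_stat(rng.standard_cauchy((reps, n)))
    return np.quantile(v0, 1.0 - alpha)

n, alpha = 50, 0.05
k = calibrate_k(n, alpha)
x = 0.7 + rng.standard_cauchy(n)          # one sample drawn from theta = 0.7
print(f"k ~ {k:.2f}, v = {v_stat(x):.2f}, reject H0: {v_stat(x) > k}")
```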