Empirical best prediction under area-level Poisson mixed models

Miguel Boubeta · María José Lombardía · Domingo Morales

Miguel Boubeta, Facultad de Informática, Universidade da Coruña, A Coruña, Spain. Tel.: +34 981167000. E-mail: miguel.boubeta@udc.es
María José Lombardía, Facultad de Informática, Universidade da Coruña, A Coruña, Spain. Tel.: +34 981167000. E-mail: maria.jose.lombardia@udc.es
Domingo Morales, Centro de Investigación Operativa, Universidad Miguel Hernández de Elche, Elche, Spain. Tel.: +34 966658709. E-mail: d.morales@umh.es

A Appendix

A.1 Components of the MM Newton-Raphson algorithm

The MM Newton-Raphson algorithm is fully specified once we calculate the expectations appearing in $f(\theta)$ and their partial derivatives. The expectation of $y_d$ is

$E_\theta[y_d] = E_v\big[E_\theta[y_d|v]\big] = E_v[\nu_d p_d] = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}$.

Therefore, the first $p$ MM equations are

$f_k(\theta) = \frac{1}{D}\sum_{d=1}^{D}\nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}x_{dk} - \frac{1}{D}\sum_{d=1}^{D}y_d x_{dk}, \quad k=1,\ldots,p$.

The derivatives of $E_\theta[y_d]$ are

$\frac{\partial E_\theta[y_d]}{\partial\beta_k} = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}x_{dk}, \qquad \frac{\partial E_\theta[y_d]}{\partial\phi} = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}\phi$.
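As a computational illustration of the expressions above, the following minimal sketch (not the authors' code) evaluates $E_\theta[y_d]$ and the first $p$ moment equations. It assumes a design matrix X with rows $x_d$, a vector nu of known constants $\nu_d$, observed counts y, and the parameter vector theta $=(\beta_1,\ldots,\beta_p,\phi)$; the simulated data at the end are purely hypothetical.

```python
# Minimal sketch of the first p moment equations of Appendix A.1, assuming:
#   X  : (D, p) matrix with rows x_d,
#   nu : (D,) vector of known constants nu_d,
#   y  : (D,) vector of observed counts,
#   theta = (beta_1, ..., beta_p, phi).
import numpy as np

def expected_y(theta, X, nu):
    """E_theta[y_d] = nu_d * exp{x_d beta + phi^2 / 2}."""
    beta, phi = theta[:-1], theta[-1]
    return nu * np.exp(X @ beta + 0.5 * phi**2)

def f_first_p(theta, X, nu, y):
    """f_k(theta) = (1/D) sum_d (E_theta[y_d] - y_d) x_dk, k = 1, ..., p."""
    D = X.shape[0]
    resid = expected_y(theta, X, nu) - y
    return X.T @ resid / D

# Example with simulated (hypothetical) data.
rng = np.random.default_rng(0)
D, p = 50, 2
X = np.column_stack([np.ones(D), rng.normal(size=D)])
nu = rng.integers(50, 200, size=D).astype(float)
beta_true, phi_true = np.array([-3.0, 0.5]), 0.4
v = rng.normal(size=D)
y = rng.poisson(nu * np.exp(X @ beta_true + phi_true * v))
print(f_first_p(np.append(beta_true, phi_true), X, nu, y))
```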
The expectation of $y_d^2$ is $E_\theta[y_d^2] = E_v\big[E_\theta[y_d^2|v]\big]$, where $E_\theta[y_d^2|v] = \mathrm{var}_\theta[y_d|v] + E_\theta^2[y_d|v] = \nu_d p_d + \nu_d^2 p_d^2$, and therefore

$E_\theta[y_d^2] = E_v\big[E_\theta[y_d^2|v]\big] = \int \nu_d p_d f_v(v_d)\,dv_d + \int \nu_d^2 p_d^2 f_v(v_d)\,dv_d = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\} + \nu_d^2\exp\{2x_d\beta + 2\phi^2\}$,

where the p.d.f. of $v_d$ is that of the $N(0,1)$ distribution. Then, the last MM equation is

$f_{p+1}(\theta) = \frac{1}{D}\sum_{d=1}^{D}\big[\nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\} + \nu_d^2\exp\{2x_d\beta + 2\phi^2\}\big] - \frac{1}{D}\sum_{d=1}^{D}y_d^2$.

The derivatives of $E_\theta[y_d^2]$ are

$\frac{\partial E_\theta[y_d^2]}{\partial\beta_k} = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}x_{dk} + 2\nu_d^2\exp\{2x_d\beta + 2\phi^2\}x_{dk}$,

$\frac{\partial E_\theta[y_d^2]}{\partial\phi} = \nu_d\exp\{x_d\beta + \tfrac{1}{2}\phi^2\}\phi + 4\nu_d^2\exp\{2x_d\beta + 2\phi^2\}\phi$.

The elements of the Jacobian matrix in the Newton-Raphson algorithm are

$H_{kr} = \frac{\partial f_k(\theta)}{\partial\theta_r} = \frac{1}{D}\sum_{d=1}^{D}\frac{\partial E_\theta[y_d]}{\partial\theta_r}x_{dk}, \quad k=1,\ldots,p, \ r=1,\ldots,p+1$,

$H_{p+1,r} = \frac{\partial f_{p+1}(\theta)}{\partial\theta_r} = \frac{1}{D}\sum_{d=1}^{D}\frac{\partial E_\theta[y_d^2]}{\partial\theta_r}, \quad r=1,\ldots,p+1$.

A.2 Calculation of the MSE of the EBP

The MSE of the EBP can be decomposed in the following form:

$\mathrm{MSE}(\hat p_d) = E\big[(\hat p_d(\hat\theta) - p_d(\theta,v_d))^2\big] = E\big[\{(\hat p_d(\hat\theta) - \hat p_d(\theta)) + (\hat p_d(\theta) - p_d(\theta,v_d))\}^2\big] = E\big[(\hat p_d(\hat\theta) - \hat p_d(\theta))^2\big] + E\big[(\hat p_d(\theta) - p_d(\theta,v_d))^2\big]$,

because

$E\big[(\hat p_d(\hat\theta) - \hat p_d(\theta))(\hat p_d(\theta) - p_d(\theta,v_d))\big] = E\big[(\hat p_d(\hat\theta) - \hat p_d(\theta))\,E[\hat p_d(\theta) - p_d(\theta,v_d)\,|\,y_d]\big] = 0$.

The second term of $\mathrm{MSE}(\hat p_d)$ is the MSE of the BP, namely

$g_d(\theta) = E\big[(\hat p_d(\theta) - p_d(\theta,v_d))^2\big] = E[\hat p_d^2(\theta)] + E[p_d^2(\theta,v_d)] - 2E\big[\hat p_d(\theta)E[p_d(\theta,v_d)\,|\,y_d]\big] = E[p_d^2(\theta,v_d)] - E[\hat p_d^2(\theta)]$.

The first term of $g_d(\theta)$ is

$E[p_d^2(\theta,v_d)] = \int \exp\{2x_d\beta + 2\phi v_d\}f(v_d)\,dv_d = \exp\{2x_d\beta + 2\phi^2\}$.

The second term of $g_d(\theta)$ is

$E[\hat p_d^2(\theta)] = E[\psi_d^2(y_d,\theta)] = \sum_{j=0}^{\infty}\psi_d^2(j,\theta)\,p_d(j,\theta)$,

where

$p_d(j,\theta) = P(y_d = j) = \int P(y_d = j\,|\,v_d)f(v_d)\,dv_d = \frac{\nu_d^j}{j!}\int \exp\big\{j(x_d\beta + \phi v_d) - \nu_d\exp\{x_d\beta + \phi v_d\}\big\}f(v_d)\,dv_d = \frac{\nu_d^j}{j!}D_d(j,\theta)$.
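The probabilities $p_d(j,\theta)$ and the first term of $g_d(\theta)$ are straightforward to evaluate numerically. The sketch below (an illustration, not the authors' implementation) approximates the one-dimensional integral $D_d(j,\theta)$ by Gauss-Hermite quadrature for the $N(0,1)$ density; the test values of theta, x_d and nu_d are hypothetical.

```python
# p_d(j, theta) = (nu_d^j / j!) D_d(j, theta) via Gauss-Hermite quadrature over v_d ~ N(0,1),
# together with the first term of g_d(theta), E[p_d^2(theta, v_d)] = exp{2 x_d beta + 2 phi^2}.
import numpy as np
from scipy.stats import poisson

def prob_yd(j, theta, x_d, nu_d, n_nodes=30):
    """p_d(j, theta) = int P(y_d = j | v_d) f(v_d) dv_d, approximated numerically."""
    beta, phi = theta[:-1], theta[-1]
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    v = np.sqrt(2.0) * nodes                  # change of variable for the N(0, 1) density
    w = weights / np.sqrt(np.pi)
    mu = nu_d * np.exp(x_d @ beta + phi * v)  # conditional Poisson mean nu_d p_d
    return np.sum(w * poisson.pmf(j, mu))     # pmf = nu_d^j exp{j(x_d b + phi v) - nu_d e^{x_d b + phi v}} / j!

def first_term_gd(theta, x_d):
    """E[p_d^2(theta, v_d)] = exp{2 x_d beta + 2 phi^2}."""
    beta, phi = theta[:-1], theta[-1]
    return np.exp(2.0 * (x_d @ beta) + 2.0 * phi**2)

# Sanity check with hypothetical values: the p_d(j, theta) should sum to one over j.
theta = np.array([-3.0, 0.5, 0.4])
x_d, nu_d = np.array([1.0, 0.2]), 120.0
print(sum(prob_yd(j, theta, x_d, nu_d) for j in range(200)))  # close to 1
print(first_term_gd(theta, x_d))
```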
Concerning the first term of $\mathrm{MSE}(\hat p_d)$, we expand $\hat p_d(\hat\theta)$ in a Taylor series around $\theta$ and we have

$\hat p_d(\hat\theta) - \hat p_d(\theta) = \psi_d(y_d,\hat\theta) - \psi_d(y_d,\theta) = \frac{\partial\psi_d(y_d,\theta)}{\partial\theta'}(\hat\theta-\theta) + \frac{1}{2}(\hat\theta-\theta)'\frac{\partial^2\psi_d(y_d,\theta)}{\partial\theta\,\partial\theta'}(\hat\theta-\theta) + o(\|\hat\theta-\theta\|^2)$.

We assume that the $x_d$'s are bounded and fulfill regularity condition (3) of Jiang (2003). Then we have $\hat\theta - \theta = O_P(D^{-1/2})$ and

$E\big[(\hat p_d(\hat\theta) - \hat p_d(\theta))^2\big] = \frac{1}{D}E\Big[\Big(\frac{\partial\psi_d(y_d,\theta)}{\partial\theta'}\sqrt{D}(\hat\theta-\theta)\Big)^2\Big] + o(1/D)$.

Now we consider $\hat\theta_{-d}$, an estimator based on $y_{-d} = (y_\ell)_{\ell\neq d}$, and write $\hat p_d(\hat\theta_{-d}) = \psi_d(y_d,\hat\theta_{-d})$. Then, by the independence of $y_d$ and $y_{-d}$, we have

$a_d(\theta) = E\Big[\Big(\frac{\partial\psi_d(y_d,\theta)}{\partial\theta'}\sqrt{D}(\hat\theta_{-d}-\theta)\Big)^2\Big] = \sum_{j=0}^{\infty}E\Big[\Big(\frac{\partial\psi_d(y_d,\theta)}{\partial\theta'}\sqrt{D}(\hat\theta_{-d}-\theta)\Big)^2\,\Big|\,y_d = j\Big]\,p_d(j,\theta) = \sum_{j=0}^{\infty}\frac{\partial\psi_d(j,\theta)}{\partial\theta'}V_{-d}(\theta)\frac{\partial\psi_d(j,\theta)}{\partial\theta}\,p_d(j,\theta)$,

where

$V_{-d}(\theta) = DE\big[(\hat\theta_{-d}-\theta)(\hat\theta_{-d}-\theta)'\,|\,y_d = j\big] = DE\big[(\hat\theta_{-d}-\theta)(\hat\theta_{-d}-\theta)'\big]$.

It follows that

$\mathrm{MSE}(\hat p_d) = g_d(\theta) + \frac{1}{D}a_d(\theta) + o(1/D)$.

If the $x_d$'s also fulfill regularity conditions (4) and (5) of Jiang (2003), then we may replace $\hat\theta_{-d}$ by $\hat\theta$, an estimator of $\theta$ based on all the data, and we obtain

$\mathrm{MSE}(\hat p_d) = g_d(\theta) + \frac{1}{D}c_d(\theta) + o(1/D)$,

where

$c_d(\theta) = \sum_{j=0}^{\infty}\frac{\partial\psi_d(j,\theta)}{\partial\theta'}V(\theta)\frac{\partial\psi_d(j,\theta)}{\partial\theta}\,p_d(j,\theta), \qquad V(\theta) = DE\big[(\hat\theta-\theta)(\hat\theta-\theta)'\big]$.
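In practice the approximation $g_d(\theta) + c_d(\theta)/D$ can be evaluated by truncating the sums over $j$. The sketch below is only an outline under stated assumptions: psi_d(j, theta) is a hypothetical callable implementing the best predictor $\psi_d$ of the main text for area $d$; pj is an array of the probabilities $p_d(j,\theta)$ (for example from the quadrature sketch in A.2); Ep2 is the first term of $g_d(\theta)$; and V is an estimate of $V(\theta)$. The gradient of $\psi_d$ is taken by central differences, anticipating formula (A.3) of Appendix A.4.

```python
# Outline (not the authors' implementation) of the second-order approximation
# MSE(hat p_d) ~ g_d(theta) + c_d(theta) / D, with the sums over j truncated at len(pj) - 1.
import numpy as np

def grad_psi(j, theta, psi_d, eps=1e-5):
    """Central-difference gradient of psi_d(j, .) at theta, cf. (A.3) in A.4 below."""
    grad = np.zeros(theta.size)
    for r in range(theta.size):
        e_r = np.zeros(theta.size)
        e_r[r] = eps
        grad[r] = (psi_d(j, theta + e_r) - psi_d(j, theta - e_r)) / (2.0 * eps)
    return grad

def mse_approx(theta, D, psi_d, pj, Ep2, V):
    """g_d(theta) + c_d(theta) / D with hypothetical inputs psi_d, pj, Ep2 and V."""
    j_max = len(pj) - 1
    g_d = Ep2 - sum(psi_d(j, theta) ** 2 * pj[j] for j in range(j_max + 1))
    c_d = sum(grad_psi(j, theta, psi_d) @ V @ grad_psi(j, theta, psi_d) * pj[j]
              for j in range(j_max + 1))
    return g_d + c_d / D
```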
A.3 Proof of Proposition 4.1

A first-order multivariate Taylor expansion of $M(\hat\theta)$ around $\theta$ yields

$M(\hat\theta) = M(\theta) + H(\theta)(\hat\theta-\theta) + o(\|\hat\theta-\theta\|)$,

where $H(\theta) = \partial M(\theta)/\partial\theta'$ is the Jacobian matrix with the elements given in A.1. Hence

$\hat\theta - \theta = H^{-1}(\theta)\big(M(\hat\theta) - M(\theta)\big) + o(\|\hat\theta-\theta\|)$.   (A.1)

Let us consider a second-order Taylor expansion of the $k$th component of $M(\hat\theta)$, denoted by $M_k(\hat\theta)$, around $\theta$ and substitute (A.1) in the quadratic term, i.e.

$M_k(\hat\theta) = M_k(\theta) + \frac{\partial M_k(\theta)}{\partial\theta'}(\hat\theta-\theta) + \frac{1}{2}(\hat\theta-\theta)'\frac{\partial^2 M_k(\theta)}{\partial\theta\,\partial\theta'}(\hat\theta-\theta) + o(\|\hat\theta-\theta\|^2)$

$= M_k(\theta) + \frac{\partial M_k(\theta)}{\partial\theta'}(\hat\theta-\theta) + \frac{1}{2}\big(M(\hat\theta)-M(\theta)\big)'H^{-1}(\theta)'\frac{\partial^2 M_k(\theta)}{\partial\theta\,\partial\theta'}H^{-1}(\theta)\big(M(\hat\theta)-M(\theta)\big) + o(\|\hat\theta-\theta\|^2)$.

The corresponding multivariate Taylor expansion of $M(\hat\theta)$ around $\theta$ is

$M(\hat\theta) = M(\theta) + H(\theta)(\hat\theta-\theta) + \frac{1}{2}\underset{1\le k\le p+1}{\mathrm{col}}\Big(\big(M(\hat\theta)-M(\theta)\big)'H^{-1}(\theta)'\frac{\partial^2 M_k(\theta)}{\partial\theta\,\partial\theta'}H^{-1}(\theta)\big(M(\hat\theta)-M(\theta)\big)\Big) + o(\|\hat\theta-\theta\|^2)$

$= M(\theta) + H(\theta)(\hat\theta-\theta) + \frac{1}{2D}q_D + o(\|\hat\theta-\theta\|^2)$.

Therefore

$\hat\theta - \theta = H^{-1}(\theta)\Big(M(\hat\theta) - M(\theta) - \frac{1}{2D}q_D\Big) + o(\|\hat\theta-\theta\|^2)$.   (A.2)

By substituting (A.2) in the expression of $b_d(\theta)$, given in (7), we obtain

$b_d(\theta) = \frac{\partial g_d(\theta)}{\partial\theta'}DH^{-1}(\theta)\Big\{E[M(\hat\theta)] - M(\theta) - \frac{1}{2D}E[q_D]\Big\} + \frac{D}{2}E\Big[\Big(M(\hat\theta)-M(\theta)-\frac{1}{2D}q_D\Big)'H^{-1}(\theta)'\frac{\partial^2 g_d(\theta)}{\partial\theta\,\partial\theta'}H^{-1}(\theta)\Big(M(\hat\theta)-M(\theta)-\frac{1}{2D}q_D\Big)\Big] + D\,o(\|\hat\theta-\theta\|^2)$.

On the one hand, we substitute $M(\hat\theta)$ by $\hat M$, so that $E[M(\hat\theta)] - M(\theta) = E[\hat M] - M(\theta) = 0$ by taking expectations in the natural equations of the MM algorithm. On the other hand, all the quadratic forms in the second summand containing $q_D$ are $o(1)$. Therefore

$b_d(\theta) = -\frac{1}{2}\frac{\partial g_d(\theta)}{\partial\theta'}H^{-1}(\theta)E[q_D] + \frac{1}{2}E\Big[\sqrt{D}\big(M(\hat\theta)-M(\theta)\big)'H^{-1}(\theta)'\frac{\partial^2 g_d(\theta)}{\partial\theta\,\partial\theta'}H^{-1}(\theta)\sqrt{D}\big(M(\hat\theta)-M(\theta)\big)\Big] + o(1)$

$= \frac{1}{2}\Big\{E[r_{D,d}] - \frac{\partial g_d(\theta)}{\partial\theta'}H^{-1}(\theta)E[q_D]\Big\} + o(1) = B_d(\theta) + o(1)$.
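The key step above is that the empirical moments are unbiased for the theoretical moments, $E[\hat M] = M(\theta)$. The following small simulation (an illustrative check, not part of the proof, with purely hypothetical parameter values) verifies this numerically for the moments of Appendix A.1.

```python
# Numerical illustration of E[M_hat] = M(theta): under the model, the empirical moments
# used by the MM algorithm match the theoretical moments of Appendix A.1 on average.
import numpy as np

rng = np.random.default_rng(1)
D, beta, phi = 50, np.array([-3.0, 0.5]), 0.4
X = np.column_stack([np.ones(D), rng.normal(size=D)])
nu = rng.integers(50, 200, size=D).astype(float)

mean_y = nu * np.exp(X @ beta + 0.5 * phi**2)                    # E[y_d]
mean_y2 = mean_y + nu**2 * np.exp(2 * (X @ beta) + 2 * phi**2)   # E[y_d^2]
M_theta = np.append(X.T @ mean_y / D, np.mean(mean_y2))          # theoretical moments M(theta)

M_hat = np.zeros_like(M_theta)
R = 20000
for _ in range(R):                                               # Monte Carlo average of
    y = rng.poisson(nu * np.exp(X @ beta + phi * rng.normal(size=D)))  # the empirical moments
    M_hat += np.append(X.T @ y / D, np.mean(y**2)) / R

print(np.round(M_theta, 3))
print(np.round(M_hat, 3))   # agrees with M_theta up to simulation error
```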
A.4 Derivatives of a real-valued function $h(\theta)$

Let $h:\mathbb{R}^n\rightarrow\mathbb{R}$ be a twice continuously differentiable real-valued function. Let us define the column vectors

$\theta = \underset{1\le r\le n}{\mathrm{col}}(\theta_r)\in\mathbb{R}^n, \qquad e_r = (0,\ldots,0,1^{(r)},0,\ldots,0)' = \underset{1\le i\le n}{\mathrm{col}}(\delta_{ir}), \qquad e_{rs} = e_r + e_s$,

where $\delta_{ij} = 0$ if $i\neq j$ and $\delta_{ij} = 1$ if $i=j$. For $\varepsilon > 0$, first-order Taylor expansions of $h(\theta+\varepsilon e_r)$ and $h(\theta-\varepsilon e_r)$ around $\theta$ yield

$h(\theta+\varepsilon e_r) = h(\theta) + \frac{\partial h(\theta)}{\partial\theta_r}\varepsilon + o(\varepsilon), \qquad h(\theta-\varepsilon e_r) = h(\theta) - \frac{\partial h(\theta)}{\partial\theta_r}\varepsilon + o(\varepsilon)$.

By subtraction, we get $h(\theta+\varepsilon e_r) - h(\theta-\varepsilon e_r) = 2\varepsilon\frac{\partial h(\theta)}{\partial\theta_r} + o(\varepsilon)$, and therefore

$\frac{\partial h(\theta)}{\partial\theta_r} = \frac{1}{2\varepsilon}\big\{h(\theta+\varepsilon e_r) - h(\theta-\varepsilon e_r)\big\} + o(\varepsilon)$.   (A.3)

For $\varepsilon > 0$, a second-order Taylor expansion of $h(\theta+\varepsilon e_r)$ around $\theta$ yields

$h(\theta+\varepsilon e_r) = h(\theta) + \frac{\partial h(\theta)}{\partial\theta_r}\varepsilon + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_r^2}\varepsilon^2 + o(\varepsilon^2)$.

By applying (A.3), we get

$h(\theta+\varepsilon e_r) = h(\theta) + \frac{1}{2}\big\{h(\theta+\varepsilon e_r) - h(\theta-\varepsilon e_r)\big\} + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_r^2}\varepsilon^2 + o(\varepsilon^2)$,

and therefore

$\frac{\partial^2 h(\theta)}{\partial\theta_r^2} = \frac{1}{\varepsilon^2}\big\{h(\theta+\varepsilon e_r) + h(\theta-\varepsilon e_r) - 2h(\theta)\big\} + o(\varepsilon)$.   (A.4)

For $\varepsilon > 0$, second-order Taylor expansions of $h(\theta+\varepsilon e_{rs})$ and $h(\theta-\varepsilon e_{rs})$ around $\theta$ yield

$h(\theta+\varepsilon e_{rs}) = h(\theta) + \frac{\partial h(\theta)}{\partial\theta_r}\varepsilon + \frac{\partial h(\theta)}{\partial\theta_s}\varepsilon + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_r^2}\varepsilon^2 + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_s^2}\varepsilon^2 + \frac{\partial^2 h(\theta)}{\partial\theta_r\partial\theta_s}\varepsilon^2 + o(\varepsilon^2)$,

$h(\theta-\varepsilon e_{rs}) = h(\theta) - \frac{\partial h(\theta)}{\partial\theta_r}\varepsilon - \frac{\partial h(\theta)}{\partial\theta_s}\varepsilon + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_r^2}\varepsilon^2 + \frac{1}{2}\frac{\partial^2 h(\theta)}{\partial\theta_s^2}\varepsilon^2 + \frac{\partial^2 h(\theta)}{\partial\theta_r\partial\theta_s}\varepsilon^2 + o(\varepsilon^2)$.

By summation, we get

$h(\theta+\varepsilon e_{rs}) + h(\theta-\varepsilon e_{rs}) = 2h(\theta) + \frac{\partial^2 h(\theta)}{\partial\theta_r^2}\varepsilon^2 + \frac{\partial^2 h(\theta)}{\partial\theta_s^2}\varepsilon^2 + 2\frac{\partial^2 h(\theta)}{\partial\theta_r\partial\theta_s}\varepsilon^2 + o(\varepsilon^2)$.

For $r\neq s$, we obtain

$\frac{\partial^2 h(\theta)}{\partial\theta_r\partial\theta_s} = \frac{1}{2\varepsilon^2}\big\{h(\theta+\varepsilon e_{rs}) + h(\theta-\varepsilon e_{rs}) - 2h(\theta)\big\} - \frac{1}{2}\Big\{\frac{\partial^2 h(\theta)}{\partial\theta_r^2} + \frac{\partial^2 h(\theta)}{\partial\theta_s^2}\Big\} + o(\varepsilon)$.   (A.5)
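Formulas (A.3)-(A.5) translate directly into a generic finite-difference routine. The sketch below is an illustrative implementation (the function h, the evaluation point theta and the test function at the end are user-supplied and hypothetical): the gradient uses (A.3), the diagonal of the Hessian uses (A.4), and the off-diagonal entries use (A.5).

```python
# Central-difference gradient and Hessian of a smooth real-valued function h: R^n -> R,
# following formulas (A.3)-(A.5) of Appendix A.4.
import numpy as np

def num_gradient(h, theta, eps=1e-5):
    """(A.3): dh/dtheta_r ~ {h(theta + eps e_r) - h(theta - eps e_r)} / (2 eps)."""
    n = theta.size
    grad = np.zeros(n)
    for r in range(n):
        e_r = np.zeros(n)
        e_r[r] = 1.0
        grad[r] = (h(theta + eps * e_r) - h(theta - eps * e_r)) / (2.0 * eps)
    return grad

def num_hessian(h, theta, eps=1e-4):
    """(A.4) for the diagonal entries and (A.5) for the off-diagonal entries."""
    n = theta.size
    H = np.zeros((n, n))
    I = np.eye(n)
    for r in range(n):
        H[r, r] = (h(theta + eps * I[r]) + h(theta - eps * I[r]) - 2.0 * h(theta)) / eps**2
    for r in range(n):
        for s in range(r + 1, n):
            e_rs = I[r] + I[s]
            H[r, s] = ((h(theta + eps * e_rs) + h(theta - eps * e_rs) - 2.0 * h(theta))
                       / (2.0 * eps**2) - 0.5 * (H[r, r] + H[s, s]))
            H[s, r] = H[r, s]
    return H

# Quick check on the hypothetical test function h(theta) = exp{theta_1} + theta_1 * theta_2^2.
h = lambda t: np.exp(t[0]) + t[0] * t[1]**2
theta0 = np.array([0.3, -1.2])
print(num_gradient(h, theta0))   # ~ [exp(0.3) + 1.44, 2 * 0.3 * (-1.2)]
print(num_hessian(h, theta0))    # ~ [[exp(0.3), -2.4], [-2.4, 0.6]]
```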