Solution Series 9


Lecturer: Prof. Dr. Mete SONER
Coordinator: Yilin WANG

Q1. Let $\alpha, \beta > 0$. The p.d.f. of a beta distribution with parameters $\alpha$ and $\beta$ is

$$ f(x \mid \alpha, \beta) = \begin{cases} \dfrac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1} & \text{for } 0 < x < 1, \\ 0 & \text{otherwise.} \end{cases} $$

Suppose that $X_1, \dots, X_n$ form a random sample from the Bernoulli distribution with parameter $\theta$, which is unknown ($0 < \theta < 1$). Suppose also that the prior distribution of $\theta$ is the beta distribution with parameters $\alpha > 0$ and $\beta > 0$. Show that the posterior distribution of $\theta$ given that $X_i = x_i$ ($i = 1, \dots, n$) is the beta distribution with parameters $\alpha + \sum_{i=1}^n x_i$ and $\beta + n - \sum_{i=1}^n x_i$. In particular, the family of beta distributions is a conjugate family of prior distributions for samples from a Bernoulli distribution: if the prior distribution of $\theta$ is a beta distribution, then the posterior distribution at each stage of sampling will also be a beta distribution, regardless of the observed values in the sample.

First we calculate the joint p.f. of $X_1, \dots, X_n, \theta$:

$$ f_{X_1,\dots,X_n,\theta}(x_1, \dots, x_n, x) = x^{y}(1-x)^{n-y} \cdot \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1} = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha+y-1}(1-x)^{\beta+n-y-1}, $$

where $y = x_1 + \dots + x_n$. So the marginal p.f. of $X_1, \dots, X_n$ at $(x_1, \dots, x_n)$ is

$$ \int_0^1 \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha+y-1}(1-x)^{\beta+n-y-1}\, dx = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)} \cdot \frac{\Gamma(\alpha+y)\,\Gamma(\beta+n-y)}{\Gamma(\alpha+\beta+n)}. $$

Thus the conditional p.d.f. of $\theta$ given $x_1, \dots, x_n$ is

$$ f_{\theta \mid x_1,\dots,x_n}(x) = \frac{\Gamma(\alpha+\beta+n)}{\Gamma(\alpha+y)\,\Gamma(\beta+n-y)}\, x^{\alpha+y-1}(1-x)^{\beta+n-y-1}, $$

which is the beta distribution with parameters $\alpha + y$ and $\beta + n - y$.
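Remark (illustrative numerical check, not part of the original solution). The conjugacy statement can be verified on a grid; the sketch below assumes NumPy and SciPy are available and uses arbitrary example values for $\alpha$, $\beta$ and the Bernoulli data.

```python
# Sanity check for Q1: the normalized product prior(x) * likelihood(x) should match
# the Beta(alpha + y, beta + n - y) density. All parameter values are example choices.
import numpy as np
from scipy import stats

alpha, beta_ = 2.0, 3.0
data = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])   # example Bernoulli sample
n, y = len(data), data.sum()

x = np.linspace(1e-6, 1 - 1e-6, 2001)             # grid on (0, 1)
prior = stats.beta.pdf(x, alpha, beta_)
likelihood = x**y * (1 - x)**(n - y)
posterior = prior * likelihood
posterior /= posterior.sum() * (x[1] - x[0])      # normalize numerically

claimed = stats.beta.pdf(x, alpha + y, beta_ + n - y)
print(np.max(np.abs(posterior - claimed)))        # ~0 up to grid error
```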

Q2. Let $\xi(\theta)$ be defined as follows: for constants $\alpha > 0$ and $\beta > 0$,

$$ \xi(\theta) = \begin{cases} \dfrac{\beta^\alpha}{\Gamma(\alpha)}\, \theta^{-(\alpha+1)} e^{-\beta/\theta} & \text{for } \theta > 0, \\ 0 & \text{for } \theta \le 0. \end{cases} $$

A distribution with this p.d.f. is called an inverse gamma distribution.

(a) Verify that $\xi(\theta)$ is actually a p.d.f.

(b) Consider the family of probability distributions that can be represented by a p.d.f. $\xi(\theta)$ having the given form for all possible pairs of constants $\alpha > 0$ and $\beta > 0$. Show that this family is a conjugate family of prior distributions for samples from a normal distribution with a known value of the mean $\mu$ and an unknown value of the variance $\theta$.

(a) With the substitution $y = \beta/\theta$ (so that $d\theta = -\beta y^{-2}\, dy$), the integral of $\xi$ is

$$ \int_0^\infty \frac{\beta^\alpha}{\Gamma(\alpha)}\, \theta^{-(\alpha+1)} e^{-\beta/\theta}\, d\theta = \frac{\beta^\alpha}{\Gamma(\alpha)} \int_0^\infty \frac{\beta}{y^2} \cdot \frac{y^{\alpha+1}}{\beta^{\alpha+1}}\, e^{-y}\, dy = \frac{1}{\Gamma(\alpha)} \int_0^\infty y^{\alpha-1} e^{-y}\, dy = 1, $$

and $\xi \ge 0$, so $\xi$ is indeed a p.d.f.

(b) As we did before, let $\Theta$ be a random variable following the inverse gamma distribution with parameters $\alpha$ and $\beta$, and let $X$ be a random variable such that the conditional p.d.f. of $X$ given $\Theta = \theta$ is

$$ f_{X \mid \theta}(x) = \frac{1}{\sqrt{2\pi\theta}}\, e^{-(x-\mu)^2/(2\theta)}. $$

The conditional p.d.f. of $\Theta$ given $X = x$ is

$$ f_{\Theta \mid x}(\theta) = \frac{\xi(\theta)\, f_{X \mid \theta}(x)}{\int_0^\infty \xi(t)\, f_{X \mid t}(x)\, dt} = \frac{\dfrac{\beta^\alpha}{\Gamma(\alpha)\sqrt{2\pi}}\, \theta^{-(\alpha+1/2)-1}\, e^{-(\beta+(x-\mu)^2/2)/\theta}}{\dfrac{\beta^\alpha}{\Gamma(\alpha)\sqrt{2\pi}} \cdot \dfrac{\Gamma(\alpha+1/2)}{[\beta+(x-\mu)^2/2]^{\alpha+1/2}}} = \frac{(\beta')^{\alpha'}}{\Gamma(\alpha')}\, \theta^{-\alpha'-1}\, e^{-\beta'/\theta}, $$

where we set $\alpha' = \alpha + 1/2$ and $\beta' = \beta + (x-\mu)^2/2$. The conditional distribution of $\Theta$ given $X = x$ is therefore an inverse gamma distribution with parameters $\alpha'$ and $\beta'$. Thus the family of inverse gamma distributions is a conjugate family of prior distributions for samples from a normal distribution with a known mean $\mu$ and an unknown value of the variance.
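Remark (illustrative numerical check, not part of the original solution). Assuming NumPy and SciPy, the update rule $\alpha' = \alpha + 1/2$, $\beta' = \beta + (x-\mu)^2/2$ can be checked on a grid; all parameter values below are example choices.

```python
# Sanity check for Q2(b): prior * normal likelihood, normalized on a grid, should match
# the inverse gamma density with parameters alpha + 1/2 and beta + (x - mu)^2 / 2.
import numpy as np
from scipy import stats

alpha, beta_, mu, x_obs = 3.0, 2.0, 1.0, 2.5      # example choices

theta = np.linspace(1e-3, 60, 30001)
prior = stats.invgamma.pdf(theta, alpha, scale=beta_)
likelihood = np.exp(-(x_obs - mu)**2 / (2 * theta)) / np.sqrt(2 * np.pi * theta)
posterior = prior * likelihood
posterior /= posterior.sum() * (theta[1] - theta[0])

claimed = stats.invgamma.pdf(theta, alpha + 0.5, scale=beta_ + (x_obs - mu)**2 / 2)
print(np.max(np.abs(posterior - claimed)))        # ~0 up to grid/truncation error
```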

Q3. Suppose that the number of defects in a 1200-foot roll of magnetic recording tape has a Poisson distribution for which the value of the mean $\theta$ is unknown, and the prior distribution of $\theta$ is the gamma distribution with parameters $\alpha = 3$ and $\beta = 1$. When five rolls of this tape are selected at random and inspected, the numbers of defects found on the rolls are 2, 2, 6, 0 and 3.

(a) What is the posterior distribution of $\theta$?

(b) If the squared error loss function is used, what is the Bayes estimate of $\theta$?

(a) Recall that if $X$ follows the Poisson distribution with mean $\theta$, then

$$ P(X = k) = e^{-\theta}\, \frac{\theta^k}{k!}, $$

and $\theta$ has the prior p.d.f.

$$ f_\theta(x) = \frac{x^{3-1} e^{-x}}{\Gamma(3)} = \frac{x^2 e^{-x}}{2}. $$

The conditional p.d.f. of $\theta$ is obtained as before:

$$ f_{\theta \mid 2,2,6,0,3}(x) = \frac{\frac{x^2 e^{-x}}{2}\, e^{-5x}\, x^{2+2+6+0+3} / (2!\,2!\,6!\,0!\,3!)}{\int_0^\infty \frac{s^2 e^{-s}}{2}\, e^{-5s}\, s^{2+2+6+0+3} / (2!\,2!\,6!\,0!\,3!)\, ds} = \frac{e^{-6x}\, x^{15}}{\int_0^\infty e^{-6s}\, s^{15}\, ds} = \frac{6^{16}}{\Gamma(16)}\, e^{-6x}\, x^{15} =: \xi(x \mid 2, 2, 6, 0, 3). $$

Remark also that the posterior distribution of $\theta$ is the gamma distribution with parameters $(\alpha = 16, \beta = 6)$.
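Remark (illustrative numerical check, not part of the original solution). Assuming NumPy and SciPy, the posterior found in (a) can be checked on a grid:

```python
# Sanity check for Q3(a): with a Gamma(3, 1) prior and Poisson counts 2, 2, 6, 0, 3,
# the posterior should be the gamma distribution with shape 16 and rate 6.
import numpy as np
from scipy import stats

counts = np.array([2, 2, 6, 0, 3])
alpha, beta_ = 3.0, 1.0

theta = np.linspace(1e-3, 15, 20001)
prior = stats.gamma.pdf(theta, alpha, scale=1 / beta_)
likelihood = np.prod(stats.poisson.pmf(counts[:, None], theta), axis=0)
posterior = prior * likelihood
posterior /= posterior.sum() * (theta[1] - theta[0])

claimed = stats.gamma.pdf(theta, alpha + counts.sum(), scale=1 / (beta_ + len(counts)))
print(np.max(np.abs(posterior - claimed)))        # ~0 up to grid error
```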

(b) The squared error loss function is $L(\theta, a) = (\theta - a)^2$. The Bayes estimator of $\theta$ is the value $a$ which minimizes

$$ E[L(\theta, a) \mid \text{observation}] = \int L(x, a)\, \xi(x \mid \text{observation})\, dx, $$

where $\xi(\theta \mid \text{observation})$ is the posterior p.d.f. of $\theta$ given the observation. We have computed the $k$-th moments of a gamma distribution $X$ with parameters $\alpha$ and $\beta$ in a previous series; in particular

$$ E(X) = \frac{\alpha}{\beta}, \qquad E(X^2) = \frac{\alpha(\alpha+1)}{\beta^2}. $$

Hence

$$ E[(\theta - a)^2 \mid 2, 2, 6, 0, 3] = E[\theta^2 \mid 2, 2, 6, 0, 3] - 2a\, E[\theta \mid 2, 2, 6, 0, 3] + a^2 = \frac{17 \cdot 16}{6^2} - 2a\, \frac{16}{6} + a^2 = \left(a - \frac{8}{3}\right)^2 + \frac{4}{9}. $$

The Bayes estimate for the observation 2, 2, 6, 0, 3 is therefore $8/3$.

Q4. Let $c > 0$ and consider the loss function

$$ L(\theta, a) = \begin{cases} c\,(a - \theta) & \text{if } \theta < a, \\ \theta - a & \text{if } \theta \ge a. \end{cases} $$

Assume that $\theta$ has a continuous distribution.

(a) Let $a \le q$ be two real numbers. Show that

$$ E[L(\theta, a) - L(\theta, q)] \ge (q - a)\,[P(q \le \theta) - c\, P(\theta \le q)] $$

and

$$ E[L(\theta, a) - L(\theta, q)] \le (q - a)\,[P(a \le \theta) - c\, P(\theta \le a)]. $$

(b) Prove that a Bayes estimator of $\theta$ will be any $1/(1+c)$ quantile of the posterior distribution of $\theta$.

Let $\theta$ follow a continuous distribution.

(a) Let $a \le q$. Splitting the expectation over the events $\{\theta \le a\}$, $\{a \le \theta \le q\}$ and $\{q \le \theta\}$,

$$ E[L(\theta, a) - L(\theta, q)] = E[(L(\theta, a) - L(\theta, q))(\mathbf{1}_{\theta \le a} + \mathbf{1}_{a \le \theta \le q} + \mathbf{1}_{q \le \theta})] $$
$$ = E[c\,(a - \theta - q + \theta)\,\mathbf{1}_{\theta \le a}] + E[(\theta - a - c\,(q - \theta))\,\mathbf{1}_{a \le \theta \le q}] + E[(\theta - a - \theta + q)\,\mathbf{1}_{q \le \theta}] $$
$$ = c\,(a - q)\, P(\theta \le a) + (1 + c)\, E(\theta\, \mathbf{1}_{a \le \theta \le q}) - (a + cq)\, P(a \le \theta \le q) + (q - a)\, P(q \le \theta) $$
$$ \ge (q - a)\,[P(q \le \theta) - c\, P(\theta \le a)] + a\,(1 + c)\, P(a \le \theta \le q) - (a + cq)\, P(a \le \theta \le q) $$
$$ = (q - a)\,[P(q \le \theta) - c\, P(\theta \le q)], $$

where the inequality uses $\theta \ge a$ on the event $\{a \le \theta \le q\}$. Similarly, bounding $\theta$ from above by $q$ on $\{a \le \theta \le q\}$, we have

$$ E[L(\theta, a) - L(\theta, q)] \le (q - a)\,[P(a \le \theta) - c\, P(\theta \le a)]. $$
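Remark (illustrative numerical check, not part of the original solution). Both inequalities in (a) can be checked by Monte Carlo; the distribution of $\theta$, the constant $c$ and the points $a \le q$ below are arbitrary example choices, and NumPy is assumed.

```python
# Monte Carlo check of the two inequalities in Q4(a) for an example continuous theta.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.gamma(shape=2.0, scale=1.5, size=200_000)   # example distribution of theta
c = 2.0

def loss(t, a):
    # L(t, a) = c * (a - t) if t < a, and t - a if t >= a
    return np.where(t < a, c * (a - t), t - a)

a, q = 1.0, 2.5                                          # a <= q
diff = np.mean(loss(theta, a) - loss(theta, q))
lower = (q - a) * (np.mean(theta >= q) - c * np.mean(theta <= q))
upper = (q - a) * (np.mean(theta >= a) - c * np.mean(theta <= a))
print(lower <= diff <= upper)                            # expected: True
```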

(b) Let $q$ be a $1/(1+c)$-quantile of the distribution of $\theta$. Then

$$ P(\theta \le q) \ge \frac{1}{1+c} \quad \text{and} \quad P(\theta \ge q) \ge \frac{c}{1+c}. $$

For all $a \le q$, the first inequality of (a) gives

$$ E[L(\theta, a) - L(\theta, q)] \ge (q - a)\,[P(q \le \theta) - c\, P(\theta \le q)] = (q - a)\,[(1 + c)\, P(\theta \ge q) - c] \ge 0, $$

since $\theta$ is continuous, so $P(\theta \le q) = 1 - P(\theta \ge q)$. For all $a \ge q$, applying the second inequality of (a) to the pair $(q, a)$ gives

$$ E[L(\theta, a) - L(\theta, q)] \ge (q - a)\,[P(q \le \theta) - c\, P(\theta \le q)] = (q - a)\,[1 - (1 + c)\, P(\theta \le q)] \ge 0, $$

since both factors are nonpositive. Thus $q$ minimizes $E[L(\theta, a)]$ over all $a \in \mathbb{R}$, i.e. any $1/(1+c)$ quantile of the posterior distribution of $\theta$ is a Bayes estimator with respect to the loss function $L$.
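Remark (illustrative numerical check, not part of the original solution). Assuming NumPy and SciPy, one can verify numerically that the $1/(1+c)$ quantile minimizes the expected loss; the choice $c = 2$ and the Gamma(16, 6) posterior from Q3 are example values.

```python
# Check of Q4(b): the action minimizing E[L(theta, a)] over a grid should be close to
# the 1/(1+c) quantile of the distribution of theta.
import numpy as np
from scipy import stats

c = 2.0
posterior = stats.gamma(a=16, scale=1 / 6)        # e.g. the posterior found in Q3
theta = posterior.rvs(size=200_000, random_state=0)

def expected_loss(a):
    return np.mean(np.where(theta < a, c * (a - theta), theta - a))

grid = np.linspace(0.5, 6.0, 1101)
a_best = grid[np.argmin([expected_loss(a) for a in grid])]
print(a_best, posterior.ppf(1 / (1 + c)))         # the two values should nearly coincide
```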