
# In the context of Example 7.3.2, construct a density histogram of the posterior distribution of…

In the context of Example 7.3.2, construct a density histogram of the posterior distribution of … . Estimate the posterior mean of this distribution and assess the error in your approximation.

Example 7.3.2 Suppose now that (x₁, …, xₙ) is a sample from a distribution that is of the form X = μ + σZ, where Z ∼ t(λ), μ is the mean, and σ is the standard deviation of the distribution. Note that λ = ∞ corresponds to normal variation, while λ = 1 corresponds to Cauchy variation.

We will fix λ at some specified value to reflect the fact that we are interested in modeling situations in which the variable under consideration has a distribution with longer tails than the normal distribution. Typically, this manifests itself in a histogram of the data with a roughly symmetric shape but exhibiting a few extreme values out in the tails, so a t(λ) distribution might be appropriate.
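As a quick illustration of what such data look like, the model can be simulated directly; the parameter values and sample size below are arbitrary choices for illustration, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values (illustrative only)
mu, sigma, lam = 5.0, 2.0, 3.0
n = 100

# X = mu + sigma * Z with Z ~ t(lam): heavier tails than the normal
z = rng.standard_t(df=lam, size=n)
x = mu + sigma * z

# A histogram of x is roughly symmetric about mu but typically
# shows a few extreme values out in the tails.
sample_median = np.median(x)
```

With λ = 3 the sample mean is noticeably more variable than the median, which is one symptom of the longer tails this model is meant to capture.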

Suppose we place the prior on (μ, σ²) given by μ | σ² ∼ N(μ₀, τ₀²σ²) and 1/σ² ∼ Gamma(α₀, β₀). The likelihood function is given by

∏_{i=1}^{n} (1/σ) f_λ((x_i − μ)/σ),

where f_λ denotes the density of the t(λ) distribution.

This distribution is not immediately recognizable, and it is not at all clear how to generate from it.
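Even when the posterior cannot be recognized or sampled directly, its unnormalized log density can still be evaluated pointwise. The sketch below assumes the prior μ | σ² ∼ N(μ₀, τ₀²σ²), 1/σ² ∼ Gamma(α₀, β₀) discussed above; the hyperparameter defaults are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def log_unnorm_posterior(mu, sigma2, x, lam,
                         mu0=0.0, tau0sq=1.0, alpha0=2.0, beta0=1.0):
    """Unnormalized log posterior of (mu, sigma2) under the t(lam) model.

    Hyperparameter values are illustrative, not from the text.
    """
    if sigma2 <= 0:
        return -np.inf
    sigma = np.sqrt(sigma2)
    # log likelihood: sum of log[(1/sigma) f_lam((x_i - mu)/sigma)]
    ll = np.sum(stats.t.logpdf((x - mu) / sigma, df=lam) - np.log(sigma))
    # prior: mu | sigma2 ~ N(mu0, tau0sq * sigma2)
    lp_mu = stats.norm.logpdf(mu, loc=mu0, scale=np.sqrt(tau0sq * sigma2))
    # prior: 1/sigma2 ~ Gamma(alpha0, rate beta0), i.e. sigma2 is inverse-gamma
    lp_s2 = stats.invgamma.logpdf(sigma2, a=alpha0, scale=beta0)
    return ll + lp_mu + lp_s2
```

Evaluating this on a grid of (μ, σ²) values is one way to visualize the posterior before attempting to sample from it.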

It is natural, then, to see if we can implement Gibbs sampling. To do this directly, we need an algorithm to generate from the posterior of μ given the value of σ², and an algorithm to generate from the posterior of σ² given μ. Unfortunately, neither of these conditional distributions is amenable to the techniques discussed in Section 2.10, so we cannot implement Gibbs sampling directly.
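One generic workaround when the full conditionals cannot be sampled exactly is Metropolis-within-Gibbs: each unavailable conditional draw is replaced by a random-walk Metropolis accept/reject step. The sketch below (with the same assumed prior and illustrative hyperparameters and simulated data, none of which are from the text) also produces the quantities the exercise asks for: a density histogram of the retained draws, the posterior mean, and a batch-means Monte Carlo standard error.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated data from the model; the true values (4.0, 1.5) and all
# hyperparameters below are illustrative assumptions.
lam = 3.0
x = 4.0 + 1.5 * rng.standard_t(df=lam, size=50)

def log_post(mu, sigma2, mu0=0.0, tau0sq=4.0, alpha0=2.0, beta0=1.0):
    """Unnormalized log posterior: t(lam) likelihood with
    mu | sigma2 ~ N(mu0, tau0sq*sigma2) and 1/sigma2 ~ Gamma(alpha0, beta0)."""
    if sigma2 <= 0:
        return -np.inf
    sigma = np.sqrt(sigma2)
    ll = np.sum(stats.t.logpdf((x - mu) / sigma, df=lam) - np.log(sigma))
    return (ll
            + stats.norm.logpdf(mu, mu0, np.sqrt(tau0sq * sigma2))
            + stats.invgamma.logpdf(sigma2, a=alpha0, scale=beta0))

# Metropolis-within-Gibbs: one random-walk Metropolis step per coordinate.
N = 5000
mu, sigma2 = np.median(x), 1.0
mus = np.empty(N)
for i in range(N):
    prop = mu + 0.3 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop, sigma2) - log_post(mu, sigma2):
        mu = prop
    prop = sigma2 * np.exp(0.3 * rng.standard_normal())  # stays positive
    # log-normal proposal requires the Jacobian term log(prop/sigma2)
    if np.log(rng.uniform()) < (log_post(mu, prop) - log_post(mu, sigma2)
                                + np.log(prop / sigma2)):
        sigma2 = prop
    mus[i] = mu

keep = mus[1000:]                    # discard burn-in
post_mean = keep.mean()              # estimate of the posterior mean of mu
# batch-means estimate of the Monte Carlo standard error of post_mean
batch_means = keep.reshape(40, -1).mean(axis=1)
mcse = batch_means.std(ddof=1) / np.sqrt(len(batch_means))
# density histogram of the retained draws
heights, edges = np.histogram(keep, bins=30, density=True)
```

With matplotlib, `plt.hist(keep, bins=30, density=True)` draws the density histogram, and reporting `post_mean ± 3 * mcse` is one standard way to assess the error in the approximation. This is only an illustrative alternative; it is not necessarily the device the text itself develops to get around the intractable conditionals.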