
MCMC June 12, 2007

Posted by jyu in Qualifying.
  1. MCMC methods are ideally suited for models built from a sequence of conditional distributions, often called hierarchical models.  Bayesian hierarchical models offer tremendous flexibility and modularity and are particularly useful for marketing problems.
  2. Marketing applications using Bayesian methods include discrete choice models, conjoint analysis, effects of purchase timing, satiation, determinants of heterogeneity, brand preferences, etc.
  3. The emergence of MCMC methods has eliminated the computational bottleneck associated with Bayes' theorem.  MCMC substitutes a set of repetitive calculations that, in effect, simulate draws from the posterior distribution.  These draws are then used to calculate statistics of interest such as parameter estimates and confidence intervals.  The idea behind the MCMC engine that drives the HB revolution is to set up a Markov chain that generates draws from the posterior distribution of the model parameters.
  4. To obtain posterior results, MCMC simulation is often used.  The unobserved variables may be simulated alongside the model parameters from their posterior distribution.  This technique is called data augmentation.  Given the simulated unobservables, the likelihood function can be evaluated conditional on the unobservable variables.
  5. Two frequently applied MCMC samplers are the Gibbs sampler and the Metropolis-Hastings sampler.
  6. The Gibbs sampler is an MCMC method for sampling from probability densities.  It has made possible the Bayesian approach to the estimation of nonlinear panel data models, providing accurate finite-sample estimates.
  7. Gibbs sampling: an algorithm to generate a sequence of samples from the joint probability distribution of two or more random variables.  The purpose of such a sequence is to approximate the joint distribution or to compute an integral.  Gibbs sampling is a special case of the Metropolis-Hastings algorithm.  It is applicable when the joint distribution is not known explicitly but the conditional distribution of each variable is known.
  8. The Metropolis-Hastings algorithm is an accept/reject sampling algorithm used to generate a sequence of samples from a probability distribution that is difficult to sample from directly.  This sequence can be used in MCMC simulation to approximate the distribution or to compute an integral.  Both samplers are illustrated in the short sketch after this list.
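
Below is a minimal sketch of points 7 and 8 (my own illustration, not part of the original notes): a Gibbs sampler and a random-walk Metropolis-Hastings sampler targeting the same bivariate normal distribution.  The correlation rho, the proposal step size, and the number of draws are arbitrary assumptions chosen for the demo.

```python
# Sketch: Gibbs sampling and random-walk Metropolis-Hastings for a
# bivariate normal target with correlation rho (illustrative settings).
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation of the bivariate normal target (assumed)
n_draws = 5000

# --- Gibbs sampler: alternate draws from the full conditionals ---
# For a standard bivariate normal, x1 | x2 ~ N(rho * x2, 1 - rho^2),
# and symmetrically for x2 | x1, so each conditional is known in closed form.
gibbs = np.empty((n_draws, 2))
x1, x2 = 0.0, 0.0
for t in range(n_draws):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))
    gibbs[t] = (x1, x2)

# --- Random-walk Metropolis-Hastings for the same target ---
def log_target(x):
    # Log density of the bivariate normal, up to an additive constant.
    q = (x[0]**2 - 2 * rho * x[0] * x[1] + x[1]**2) / (1 - rho**2)
    return -0.5 * q

mh = np.empty((n_draws, 2))
x = np.zeros(2)
accepted = 0
for t in range(n_draws):
    proposal = x + rng.normal(scale=0.5, size=2)   # symmetric random-walk proposal
    # Accept with probability min(1, target(proposal) / target(current)).
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
        accepted += 1
    mh[t] = x

print("Gibbs sample correlation:", np.corrcoef(gibbs.T)[0, 1])
print("MH sample correlation:   ", np.corrcoef(mh.T)[0, 1],
      "acceptance rate:", accepted / n_draws)
```

Both chains should recover a sample correlation near rho; the Gibbs chain exploits the known conditionals, while the MH chain only needs the target density up to a constant.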

Steps of the algorithm implemented for estimation of the random-effects Tobit model (a simplified code sketch follows the list):

  1. Run a GLS estimation with the original truncated data to obtain the initial values
  2. Sample the censored variables to build the augmented dataset
  3. Run a GLS estimation on the panel with the augmented dataset to compute new mean values
  4. Draw betas from their conditional distribution
  5. Estimate the individual effects using the residuals from the previous steps
  6. Draw w from its conditional distribution
  7. Draw sigma from its conditional distribution
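
The following is a hedged sketch of the data-augmentation idea behind these steps, reduced to a pooled Bayesian Tobit regression under diffuse priors: impute the censored latent outcomes, then draw beta and sigma² from their full conditionals.  It is not the GLS-based random-effects procedure above; the individual-effects and w draws (steps 5–7) are omitted for brevity, and every prior, dataset, and setting here is an illustrative assumption of mine.

```python
# Sketch: data-augmentation Gibbs sampler for a pooled Tobit regression
# with left-censoring at 0 (illustrative, diffuse-prior version).
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)

# Simulated left-censored data: y = max(0, X beta + e)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true, sigma_true = np.array([0.5, 1.0]), 1.0
y_star_true = X @ beta_true + rng.normal(scale=sigma_true, size=n)
y = np.maximum(y_star_true, 0.0)
censored = y == 0.0

n_draws = 2000
# In the spirit of step 1: rough initial values from least squares
# on the observed (censored) data.
beta = np.linalg.lstsq(X, y, rcond=None)[0]
sigma2 = 1.0
draws = np.empty((n_draws, k + 1))
XtX_inv = np.linalg.inv(X.T @ X)

for t in range(n_draws):
    # Step 2: data augmentation -- draw latent y* for censored observations
    # from a normal truncated above at the censoring point 0.
    mu_c = X[censored] @ beta
    sd = np.sqrt(sigma2)
    y_aug = y.copy()
    y_aug[censored] = truncnorm.rvs(-np.inf, (0.0 - mu_c) / sd,
                                    loc=mu_c, scale=sd, random_state=rng)

    # Steps 3-4: draw beta from its conditional, a normal centered at the
    # least-squares estimate computed on the augmented dataset.
    beta_hat = XtX_inv @ X.T @ y_aug
    beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)

    # Step 7 (analogue): draw sigma^2 from its inverse-gamma conditional.
    resid = y_aug - X @ beta
    sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / (resid @ resid))

    draws[t] = np.append(beta, np.sqrt(sigma2))

burn = n_draws // 2
print("posterior means (beta0, beta1, sigma):", draws[burn:].mean(axis=0))
```

After discarding the burn-in, the posterior means should sit near the true values used to simulate the data, which is the basic check that the augmentation-plus-conditional-draw loop is mixing over the right posterior.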