Welcome! This website introduces Gibbs sampling in the world of Bayesian statistics. It is a class project for the course MATH/STAT 455: Mathematical Statistics at Macalester College, and the contents of this blog are a collaborative effort by Kristy Ma and Regan Brodine.
Welcome to our project page on Gibbs sampling, an example of a Markov chain Monte Carlo (MCMC) algorithm that uses conditional probabilities to iteratively draw posterior samples for estimating the probability distribution of unknown parameters.
Our motivation: Gibbs sampling connects to our class because it is a sampling method used in Bayesian statistics. While frequentist statistics relies only on observed data, Bayesian statistics incorporates both prior knowledge and observed data. When estimating parameters in a Bayesian framework, computing the multi-dimensional integrals needed for the posterior distribution is labor intensive and often intractable.
This gives rise to Markov chain Monte Carlo (MCMC) integration, a convenient way to generate dependent samples. Gibbs sampling is an MCMC-based numerical integration method that relies on conditional distributions. The technique also ties into a cool application in statistical genetics: inferring genotypes in unknown population groups under Hardy-Weinberg equilibrium.
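To make the idea concrete, here is a minimal R sketch (our own illustrative example, not part of the course materials) of a Gibbs sampler targeting a bivariate normal distribution with an assumed correlation rho: each iteration draws one coordinate from its conditional distribution given the current value of the other.

```r
# Minimal Gibbs sampler sketch: sample from a bivariate normal with
# correlation rho by alternating draws from the full conditionals
# x | y ~ N(rho * y, 1 - rho^2) and y | x ~ N(rho * x, 1 - rho^2).
set.seed(455)            # seed chosen arbitrarily for reproducibility
rho    <- 0.6            # assumed correlation, purely for illustration
n_iter <- 5000           # number of Gibbs iterations

x <- numeric(n_iter)
y <- numeric(n_iter)
x[1] <- 0; y[1] <- 0     # arbitrary starting values

for (t in 2:n_iter) {
  # Draw x given the most recent y, then y given the newly drawn x
  x[t] <- rnorm(1, mean = rho * y[t - 1], sd = sqrt(1 - rho^2))
  y[t] <- rnorm(1, mean = rho * x[t],     sd = sqrt(1 - rho^2))
}

# Inspect the dependent samples, e.g. with a trace plot of x
plot(x, type = "l", main = "Trace plot of x", xlab = "Iteration", ylab = "x")
```

The trace plot at the end previews the kind of diagnostic we discuss in the learning goals below.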
To start exploring, simply navigate to the tabs in the upper right corner:
Learning Goals:
Be able to explain what Gibbs Sampling is and when to use it.
In addition to knowing the uses of Gibbs sampling, understand its limitations and which properties of trace plots are evidence of good performance.
Simulate data from a Gibbs sampling framework using R.
Understand that Gibbs sampling is a special case of Metropolis-Hastings with an acceptance rate of 1.
Get to know an application of the Gibbs sampler in statistical genetics.
Have fun!
If you have any questions, please reach out to kristy20011001@gmail.com and rbrodine@macalester.edu. Thank you!