Can you clarify the limitations of using Bayes' Theorem in parameter estimation?

In parameter estimation, you are trying to determine the most likely value of a parameter in some model, given some data set. In very broad-brush terms, when we approach this with Bayesian parameter estimation, we identify in Bayes' Law $P(B\vert A) = \frac{P(A\vert B)
P(B)}{P(A)}$ the quantity $A$ with some experimental results and the quantity $B$ with some parameterized model. $P(A\vert B)$ is then identified with how we expect the data to vary depending on the parameters, which is typically known. $P(B)$ is called the ``prior'': it represents prior knowledge about the parameter. $P(A)$ is a normalization factor determined by the data (it can be computed by summing $P(A\vert B)P(B)$ over all possible parameter values). What you are then after is $P(B\vert A)$, the ``posterior'': the probability of the parameter given the data.

To determine $P(B\vert A)$, you must assign some reasonable distribution to the prior. If you don't really know anything about the parameter you are after, it can be tricky to decide what to use for the prior, and the answer can be sensitive to what you choose.
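As a minimal sketch of these steps (the coin-flip data and grid discretization here are illustrative choices, not from the text), suppose the parameter $B$ is the bias $p$ of a coin and the data $A$ is the number of heads observed:

```python
import math

# Hypothetical data set A: 7 heads in 10 coin flips; parameter B is the bias p.
heads, flips = 7, 10

def likelihood(p, k=heads, n=flips):
    """P(A|B): binomial probability of the observed data given bias p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Discretize the parameter on a grid and assign a flat prior P(B).
grid = [i / 100 for i in range(1, 100)]
prior = [1 / len(grid)] * len(grid)

# P(A): the normalization, summing P(A|B) P(B) over all parameter values.
evidence = sum(likelihood(p) * pr for p, pr in zip(grid, prior))

# P(B|A): the posterior over the grid, via Bayes' Law.
posterior = [likelihood(p) * pr / evidence for p, pr in zip(grid, prior)]

# With a flat prior, the posterior peaks at the observed fraction 7/10.
p_best = grid[posterior.index(max(posterior))]
```

With a flat prior the posterior simply tracks the likelihood, so the most probable bias is the observed fraction of heads; a different prior would shift this peak.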

Bayes' Law is a tremendously powerful tool in many statistical contexts, in particular when a sensible prior can be determined, because it gives a highly effective way to incorporate knowledge you already have. However, when it is hard to choose a sensible prior, the method can give dubious results.
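The sensitivity to the prior can be made concrete with a conjugate Beta prior for the coin-bias example (the specific counts and prior parameters below are illustrative assumptions): with little data, two reasonable priors give noticeably different answers, while with lots of data the prior washes out.

```python
# For a Beta(a, b) prior on a coin bias and k heads in n flips,
# the posterior is Beta(a + k, b + n - k), with mean (a + k)/(a + b + n).
def posterior_mean(k, n, a, b):
    return (a + k) / (a + b + n)

# Small data set: 2 heads in 3 flips -- the prior dominates.
flat_small = posterior_mean(2, 3, 1, 1)       # flat prior Beta(1,1)
strong_small = posterior_mean(2, 3, 10, 10)   # informative prior centered at 0.5

# Large data set: 2000 heads in 3000 flips -- the data dominate.
flat_big = posterior_mean(2000, 3000, 1, 1)
strong_big = posterior_mean(2000, 3000, 10, 10)
```

With only 3 flips the two posterior means differ substantially (0.60 vs. about 0.52), but with 3000 flips both land near 0.666: this is why the choice of prior matters most exactly when the data are least informative.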

Kate Scholberg 2014-10-24