Bayes' theorem

posted 24 Feb 2012, 04:50 by David Sherlock   [ updated 24 Feb 2012, 04:52 ]
Thinking about recursion and self-organisation has led me down the scary road of Bayes filters and Bayes' theorem.  Wikipedia promises:

A Bayes filter is an algorithm used in computer science for calculating the probabilities of multiple beliefs to allow a robot to infer its position and orientation.  

Which sounds pretty sexy. I was panicky about Bayes' theorem, but it turns out it's pretty simple. I can thank IBM for a simple explanation. I had made a start on understanding it when I talked about Bob, the agent that never took off: I thought about frequency counts for P(A | B) instead of probabilities. Recasting this formula as

P(A | B) = P(A & B) / P(B)

That is, the probability of observing A given that B has happened equals the probability of A and B both happening divided by the probability of B. We now have a definition of conditional probability. It's only a few steps from here to Bayes' theorem.
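The frequency-count view of that definition can be sketched in a few lines of Python. The counts below are made up purely for illustration:

```python
# Conditional probability from frequency counts (numbers invented for illustration).
# Out of 100 trials, B happened 40 times; A and B happened together 10 times.
total = 100
count_B = 40
count_A_and_B = 10

p_B = count_B / total              # P(B) = 0.4
p_A_and_B = count_A_and_B / total  # P(A & B) = 0.1

# P(A | B) = P(A & B) / P(B)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 0.25
```

Note that this is the same as just counting: of the 40 trials where B happened, A also happened in 10, and 10/40 = 0.25.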

1) Rewrite the equation:

P(A & B) = P(A | B) P(B)

2) Swap the roles, making A the conditioning variable:

P(A & B) = P(B | A) P(A)

3) Both right-hand sides equal P(A & B), so you can set them equal:

P(A | B) P(B) = P(B | A) P(A)

4) Divide both sides by P(B):

P(A | B) = P(B | A) P(A) / P(B)

Tada. Bayes' theorem is:

P(hypothesis | data) = P(data | hypothesis) × P(hypothesis) / P(data).

Suddenly the whole world makes sense. A code example to follow. Now on to recursive Bayesian estimation…
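Here is a minimal sketch of the theorem in Python. The coin-bias scenario and all its numbers are invented for illustration, and the denominator P(data) is expanded using the law of total probability over the hypothesis and its complement:

```python
def bayes(p_data_given_hyp, p_hyp, p_data_given_not_hyp):
    """Return P(hypothesis | data) via Bayes' theorem.

    P(data) is expanded with the law of total probability:
    P(data) = P(data | hyp) P(hyp) + P(data | not hyp) P(not hyp)
    """
    p_not_hyp = 1.0 - p_hyp
    p_data = p_data_given_hyp * p_hyp + p_data_given_not_hyp * p_not_hyp
    return p_data_given_hyp * p_hyp / p_data

# Invented example: prior belief 0.5 that a coin is biased. A biased coin
# shows heads with probability 0.9, a fair one with 0.5. We observe heads.
posterior = bayes(p_data_given_hyp=0.9, p_hyp=0.5, p_data_given_not_hyp=0.5)
print(posterior)  # 0.45 / 0.7 ≈ 0.643
```

One observation of heads shifts the belief that the coin is biased from 0.5 to about 0.64; feeding the posterior back in as the next prior is exactly the recursive step.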
