Probability

From Citizendium
== External links ==
*Intros
** http://www.mathgoodies.com/lessons/vol6/intro_probability.html
** http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/book.html

*More advanced
** http://bayes.wustl.edu/
** http://plato.stanford.edu/entries/probability-interpret/
Revision as of 13:50, 8 February 2007


Probability theory is the branch of mathematics that deals with reasoning under uncertainty.

Like algebra, geometry and other parts of mathematics, probability theory has its origins in the natural world. Humans routinely deal with incomplete or uncertain information in daily life, in decisions such as crossing the road ("will this approaching car respect the red light?"), eating food ("am I certain this food is not contaminated?"), and so on. We make such decisions based on intuition and experience.

Probability theory is a mathematical tool intended to formalize this ubiquitous mental process. The concept of probability is part of this theory and is intended to formalize uncertainty.

There are two basic ways to think about the probability concept:

*Subjective (Bayesian) probability.
*Objective probability.

The difference between the approaches is largely pedagogical, as some people find one approach or the other much easier to grasp.


== Bayesian probability ==

In this approach probability theory is viewed as an extension of classical logic, and probabilities represent states of knowledge. One starts with a set of propositions and all the available information. Using common-sense methods, one assigns "weights" to the available alternatives, generating a "prior" probability distribution. As more information comes in, one combines the prior with the new data to obtain a new "posterior" probability distribution, which represents an updated state of knowledge: each probability is a number describing how much faith one should presently place in its associated proposition.
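The prior-to-posterior update described above can be sketched in a few lines of Python. The two coin propositions and their likelihoods below are made-up illustrations, not part of the article:

```python
# A minimal sketch of a Bayesian update over a discrete set of
# propositions (hypothetical example: is a coin fair or two-headed?).

def update(prior, likelihood):
    """Combine a prior distribution with the likelihood of the
    observed data (Bayes' theorem) to obtain the posterior."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Two rival propositions, weighted equally before any data arrive.
prior = {"fair": 0.5, "two-headed": 0.5}

# Probability of observing "heads" under each proposition.
likelihood = {"fair": 0.5, "two-headed": 1.0}

posterior = update(prior, likelihood)
print(posterior)  # faith shifts toward "two-headed" (2/3 vs 1/3)
```

Each new observation can be fed through `update` again, with the current posterior serving as the next prior.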

== Objective probability ==

In this approach one views probabilities as "propensities" of the actual system under study; for instance, a fair coin has a "propensity" to show heads 50% of the time. This approach is more restrictive than the Bayesian interpretation: for instance, there is no way to assign a probability to whether or not life exists in the Andromeda galaxy, since no "propensities" have been measured.
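A propensity of this kind shows up empirically as a long-run relative frequency. The simulation below is a sketch of that idea; the trial count is arbitrary:

```python
# A fair coin's 50% propensity for heads, observed as the relative
# frequency of heads over many simulated flips (standard library only).
import random

random.seed(0)          # fixed seed so the run is reproducible
trials = 100_000
heads = sum(random.random() < 0.5 for _ in range(trials))
frequency = heads / trials
print(frequency)        # close to the propensity 0.5
```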


== Example of the Bayesian viewpoint ==

We are given a die and no information about it. By the principle of maximum entropy, we assume all six outcomes are equally likely, so our state of knowledge about the outcome is best modeled by putting equal "weights" on all six alternatives. Each weight is 1/6, since each alternative receives a sixth of our confidence. This is our prior probability distribution. There is, however, a possibility that the die is "loaded", so we watch the outcomes and continually adjust our weights according to the results of throwing the die.
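One simple way to carry out this adjustment is to keep one "pseudo-count" per face (which reproduces the uniform 1/6 prior) and add the observed counts to it; this corresponds to a standard Bayesian update under a uniform Dirichlet prior. The sample throws below are invented for illustration:

```python
# Start from the maximum-entropy prior (weight 1/6 per face) and
# adjust the weights as throws of the die are observed.

faces = range(1, 7)
pseudo_counts = {f: 1 for f in faces}   # one pseudo-count per face

def weights(counts):
    """Normalize counts into a probability distribution."""
    total = sum(counts.values())
    return {f: c / total for f, c in counts.items()}

prior = weights(pseudo_counts)          # {1: 1/6, 2: 1/6, ..., 6: 1/6}

# Suppose the die keeps showing 6 -- evidence that it may be loaded.
for throw in [6, 6, 3, 6, 6, 1, 6, 6]:
    pseudo_counts[throw] += 1

posterior = weights(pseudo_counts)
print(posterior[6])   # weight on face 6 has grown well past 1/6
```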


== Example of the objective viewpoint ==

We are given a die and a list of its measured (or theoretical) "propensities", i.e. probabilities for each possible outcome. We then use this information to calculate the probabilities of certain outcomes. If our results seem improbable, we may decide to run experiments to re-measure the propensities.
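Such a calculation can be sketched as follows. With each face's propensity given as input (here the theoretical fair-die values, assumed for illustration), we compute the probability of a derived outcome, such as two independent throws summing to seven:

```python
# Given per-face propensities, compute the probability that two
# independent throws of the die sum to a target value.
from itertools import product

propensity = {f: 1 / 6 for f in range(1, 7)}   # measured or theoretical

def prob_sum(target):
    """Probability that two independent throws sum to `target`."""
    return sum(propensity[a] * propensity[b]
               for a, b in product(propensity, repeat=2)
               if a + b == target)

print(prob_sum(7))   # 6 favorable pairs out of 36
```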

