Insight 2 - Kolmogorov Axioms and Relative Frequency
Probability theory is the branch of mathematics concerned with the analysis of phenomena characterised by randomness or uncertainty, in order to model and predict the behaviour of defined systems. Although there are several interpretations of probability, such as the propensity interpretation or the subjective one, the most widely used and established formalisation is due to Andrey N. Kolmogorov, a Russian mathematician who combined previous work in the field and presented his axiom system for probability theory in 1933.
This blog post introduces the reader to the main axioms and rules of Kolmogorov's theory.
Let S denote a sample space with a probability measure P defined over it, such that the probability of any event E ⊂ S is given by P(E). Then, the probability measure obeys the following axioms:
AXIOM 1 - The probability of an event is a non-negative real number:
P(E) ∈ ℝ, P(E) ≥ 0 ∀E ⊆ S
AXIOM 2 - Unit measure: the probability that some event in the sample space will occur is 1, which can be written as:
P(S) = 1
AXIOM 3 - For any sequence of mutually exclusive events, the probability of the union of those events equals the sum of the probabilities of the individual events (also known as σ-additivity). Formally, if {E1, E2, ..., En} is a sequence of mutually exclusive events, i.e. Ei ∩ Ej = ∅ for all i ≠ j, then
P(E1 ∪ E2 ∪ ... ∪ En) = P(E1) + P(E2) + ... + P(En).
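The three axioms can be checked concretely on a small discrete example. The sketch below (a hypothetical illustration, not taken from the post) models a fair six-sided die, where every event E gets probability |E| / |S|:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}

def P(event):
    """Uniform probability measure: |E| / |S|."""
    return Fraction(len(event), len(S))

evens, odds = {2, 4, 6}, {1, 3, 5}

# Axiom 1: probabilities are non-negative reals
assert P(evens) >= 0
# Axiom 2: unit measure, P(S) = 1
assert P(S) == 1
# Axiom 3: additivity for mutually exclusive events (evens ∩ odds = ∅)
assert P(evens | odds) == P(evens) + P(odds)
```

Using `Fraction` keeps the arithmetic exact, so the additivity check holds with equality rather than up to floating-point error.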
Given the Kolmogorov axioms, one can deduce other rules which are useful for calculating probabilities.
Monotonicity
If E is a subset of F (or equal to it), then the probability of E is less than or equal to the probability of F.
If E ⊆ F, then P(E) ≤ P(F)
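Continuing the hypothetical fair-die example, monotonicity can be verified directly, since Python's set comparison operators express the subset relation:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # fair die (illustrative example)
P = lambda A: Fraction(len(A), len(S))

E = {6}          # "the die shows a six"
F = {2, 4, 6}    # "the die shows an even number"; note E ⊆ F

assert E <= F            # E is a subset of F ...
assert P(E) <= P(F)      # ... so its probability cannot be larger
```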
Numeric Bound
0 ≤ P(E) ≤ 1. ∀E ∈ F
Addition Law of Probability
P(E ∪ F) = P(E) + P(F) - P(E ∩ F)
which is, in brief, an application of the inclusion-exclusion principle with only two sets.
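With the same hypothetical die, the addition law can be checked by taking E = "even number" and F = "greater than 3", which overlap on {4, 6}:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # fair die (illustrative example)
P = lambda A: Fraction(len(A), len(S))

E = {2, 4, 6}    # "even number"
F = {4, 5, 6}    # "greater than 3"

# Addition law: subtract the overlap {4, 6} so it is not counted twice
lhs = P(E | F)
rhs = P(E) + P(F) - P(E & F)
assert lhs == rhs == Fraction(2, 3)
```

Without the subtraction, P(E) + P(F) = 1, overstating the true probability 2/3 because the outcomes 4 and 6 would be counted twice.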
Conditional Probability
This is a measure of the probability of an event, given that another event has occurred.
Given two events E and F from a probability space, with P(F) > 0, the probability of E given that F has occurred is defined as:
P(E|F) = P(E ∩ F) / P(F)
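The definition translates directly into code. Staying with the hypothetical die: the probability that the die shows 2, given that it shows an even number, is (1/6) / (1/2) = 1/3:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # fair die (illustrative example)
P = lambda A: Fraction(len(A), len(S))

def cond(E, F):
    """P(E | F) = P(E ∩ F) / P(F); only defined when P(F) > 0."""
    assert P(F) > 0
    return P(E & F) / P(F)

# "shows 2" given "shows an even number"
assert cond({2}, {2, 4, 6}) == Fraction(1, 3)
```

Conditioning on F effectively shrinks the sample space to F, which is why the guard P(F) > 0 is part of the definition.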
Boole's Inequality
Boole's inequality, also known as the union bound, states that for any finite or countable set of events, the probability that at least one event occurs is no greater than the sum of the probabilities of the individual events. For two events A and B:
P(A ∪ B) ≤ P(A) + P(B)
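As a quick numeric check on the hypothetical die, take three overlapping events; the probability of their union never exceeds the sum of their individual probabilities:

```python
from fractions import Fraction

S = frozenset(range(1, 7))                  # fair die (illustrative example)
P = lambda A: Fraction(len(A), len(S))

events = [frozenset({1, 2}), frozenset({2, 3, 4}), frozenset({4, 5})]

# Union bound: P(union of events) <= sum of individual probabilities
union = frozenset().union(*events)
assert P(union) <= sum(P(A) for A in events)
```

Here P(union) = 5/6 while the sum is 7/6; the bound is loose exactly because the events overlap.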
Relative Frequency
Relative frequency is defined as the ratio between the number of times k in which a particular event has occurred during an experiment, and the total number of tries that have been carried out, n. Thus if we define f as the relative frequency concerning a certain event, it can be easily computed by f = k / n.
The subtle relationship that links the notion of relative frequency to the definition of probability will already be clear to most readers: relative frequency can be viewed as an estimate of the probability of a certain event.
Consider, for instance, 6 independent flips of a fair coin. We know that, on each flip, the likelihood of each outcome equals 1/2 = 50%, so we expect to see 3 heads and 3 tails. Unfortunately, we won't always end up with the expected result. However, if we were able to repeat the experiment an ever larger number of times, we would see the relative frequency get closer and closer to the probability of the event, which is 0.5. In fact, we can view the probability as the limit of the relative frequency as n tends to infinity.
P = lim (n → ∞) k / n
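This convergence is easy to observe in a simulation. The sketch below (an illustrative script, not from the post) flips a simulated fair coin n times for increasing n and prints the relative frequency f = k / n, which drifts toward 0.5:

```python
import random

random.seed(0)  # fixed seed for a reproducible run

def relative_frequency(n):
    """Flip a fair coin n times; return k / n, the fraction of heads."""
    k = sum(random.random() < 0.5 for _ in range(n))
    return k / n

for n in (10, 1_000, 100_000):
    print(n, relative_frequency(n))
```

For small n the frequency can be far from 0.5 (with 6 flips, 2 heads is not unusual), but as n grows the estimates cluster ever more tightly around the true probability.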