The variance is the expectation of the squared deviation of a random variable from its mean. Informally, it measures how far a set of (random) numbers is spread out from its average value. The variance is the square of the standard deviation, the second central moment of a distribution, and the covariance of the random variable with itself; it is often denoted Var(X), σ², or s². One of the most widely known formulas for computing the variance of a sample is

s² = (1/n) · Σᵢ (xᵢ − x̄)²

where x̄ (x-bar) is the mean of the sample. The definition given above can be converted into an algorithm that computes the variance and the standard deviation in two passes:

1. Compute the mean (O(n))
2. Compute the squared differences from the mean and average them (O(n))
3. Output the variance (its square root is the standard deviation)

Even though this algorithm works properly, it may become too expensive on some input instances. Just consider a sampling procedu...
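The two-pass procedure above can be sketched as follows (a minimal illustration, not tied to any particular library):

```python
import math

def variance_two_pass(xs):
    """Two-pass sample variance: the first pass computes the mean,
    the second pass averages the squared deviations from it."""
    n = len(xs)
    mean = sum(xs) / n                            # pass 1: O(n)
    return sum((x - mean) ** 2 for x in xs) / n   # pass 2: O(n)

def std_dev(xs):
    """Standard deviation is the square root of the variance."""
    return math.sqrt(variance_two_pass(xs))

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(variance_two_pass(data))  # → 4.0
print(std_dev(data))            # → 2.0
```

Note that both passes need the full data set in memory, which is exactly what makes this approach problematic for the streaming scenario hinted at above.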
Chebyshev's Inequality

In probability theory, Chebyshev's inequality guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more distant than a certain amount from the mean. In particular, the inequality states that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean. In other words, this means that at least (1 − 1/k²) of the distribution's values are within k standard deviations of the mean:

P(|X − E(X)| ≥ kσ) ≤ 1/k²

Chebyshev's inequality can easily be derived from Markov's inequality, which gives an upper bound for the probability that a non-negative random variable is greater than (or equal to) some positive constant. Recall Markov's inequality

P(X ≥ a) ≤ E(X)/a

where a > 0 and X is a non-negative random variable. The Chebyshev inequality follows by applying Markov's inequality to the random variable (X − E(X))² and the constant a²:

P(|X − E(X)| ≥ a) = P((X − E(X))² ≥ a²) ≤ E((X − E(X))²)/a² = Var(X)/a²

Setting a = kσ recovers the 1/k² bound.
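The bound can be checked empirically. The sketch below draws samples from an exponential distribution (an arbitrary choice for illustration) and verifies that the fraction of samples more than k standard deviations from the mean never exceeds 1/k²:

```python
import random

def fraction_outside(samples, k):
    """Fraction of samples more than k standard deviations from the mean;
    Chebyshev's inequality guarantees this is at most 1/k**2."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    sd = var ** 0.5
    outside = sum(1 for x in samples if abs(x - mean) > k * sd)
    return outside / n

random.seed(0)
data = [random.expovariate(1.0) for _ in range(100_000)]
for k in (2, 3, 4):
    frac = fraction_outside(data, k)
    print(f"k={k}: observed {frac:.4f} <= bound {1 / k**2:.4f}")
    assert frac <= 1 / k**2
```

For most distributions the observed fraction is far below the bound; Chebyshev's inequality is deliberately loose because it assumes nothing about the distribution beyond a finite variance.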
Before stating Bayes' theorem it is important to define conditional probability. Once that is done, Bayes' theorem is easy to derive.

Conditional Probability

Conditional probability measures the likelihood of an event given that another event has occurred. For instance, assume that A is the event of interest and that B is a different event which has already occurred. The probability of the event A occurring, given the occurrence of B, written P(A | B), can be computed as detailed by the following equality:

P(A | B) = P(A ∩ B) / P(B),    P(B) > 0    (1)

It is defined as the quotient of the probability of the joint event A and B and the probability of B. Now consider P(B | A), which equals P(B ∩ A) / P(A). Since P(B ∩ A) = P(A ∩ B), the two equalities can be combined: P(A | B) P(B) = P(A ∩ B) = P(B | A) P(A), and dividing both sides by P(B) yields Bayes' theorem:

P(A | B) = P(B | A) P(A) / P(B)
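A small worked example may help. The numbers below (base rate, detection rate, false-positive rate of a hypothetical diagnostic test) are assumptions chosen purely for illustration; only the formula itself comes from the derivation above:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: a condition affects 1% of a population (P(A) = 0.01);
# a test detects it 99% of the time (P(B|A) = 0.99) and reports a false
# positive 5% of the time (P(B|not A) = 0.05).
p_a = 0.01
p_b_given_a = 0.99
p_b_given_not_a = 0.05

# Law of total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

posterior = bayes(p_b_given_a, p_a, p_b)
print(round(posterior, 4))  # → 0.1667
```

Despite the test being 99% sensitive, a positive result only implies about a 1-in-6 chance of actually having the condition, because the condition is rare: a classic consequence of Bayes' theorem.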