User:Orimosenzon/notes
Expectation
Expectation according to the joint distribution equals expectation according to the marginal distribution

∑_{x,y} p(x,y)·x = ∑_x x·∑_y p(x,y) = ∑_x x·p(x)

Hence:
E(X) computed from the joint distribution p(x,y) equals E(X) computed from the marginal distribution p(x).
Also:
By symmetry, ∑_{x,y} p(x,y)·y = E(Y).
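A quick numeric sanity check of this, as a C++ sketch; the joint table p(x,y) below is made up purely for illustration:

#include <cstdio>

int main() {
    // Made-up joint distribution p(x, y) over x in {0, 1, 2}, y in {0, 1}.
    // The probabilities sum to 1; the numbers are for illustration only.
    double p[3][2] = {{0.10, 0.20}, {0.25, 0.15}, {0.05, 0.25}};
    double xv[3] = {0.0, 1.0, 2.0};

    double e_joint = 0.0;     // E(X) summed over all (x, y) pairs
    double e_marginal = 0.0;  // E(X) via the marginal p(x) = sum_y p(x, y)
    for (int i = 0; i < 3; ++i) {
        double px = 0.0;
        for (int j = 0; j < 2; ++j) {
            e_joint += p[i][j] * xv[i];
            px += p[i][j];
        }
        e_marginal += px * xv[i];
    }
    std::printf("E(X) via joint:    %f\n", e_joint);     // 1.0
    std::printf("E(X) via marginal: %f\n", e_marginal);  // 1.0
}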
linearity

E(X1 + X2) = ∑_{x1,x2} p(x1,x2)(x1 + x2) = ∑_{x1,x2} p(x1,x2)x1 + ∑_{x1,x2} p(x1,x2)x2 = E(X1) + E(X2)

(the last step uses the joint-equals-marginal result above)

hence:
E(X1 + X2) = E(X1) + E(X2)

E(λX) = ∑_x p(x)λx = λ ∑_x p(x)x = λE(X)
hence:
E(λX) = λE(X)
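A minimal Monte Carlo sketch checking both rules; the two distributions, λ = 3, the sample size and the seed are arbitrary choices:

#include <iostream>
#include <random>

int main() {
    // Monte Carlo check of E(X1 + X2) = E(X1) + E(X2) and E(λX) = λE(X).
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> x1(0.0, 1.0);  // E(X1) = 0.5
    std::exponential_distribution<double> x2(2.0);        // E(X2) = 1/2 = 0.5
    const double lambda = 3.0;
    const int n = 1000000;

    double s1 = 0, s2 = 0, s12 = 0, slx = 0;
    for (int i = 0; i < n; ++i) {
        double a = x1(rng), b = x2(rng);
        s1 += a; s2 += b; s12 += a + b; slx += lambda * a;
    }
    std::cout << "E(X1) + E(X2) ~ " << s1 / n + s2 / n << "\n";  // ~1.0
    std::cout << "E(X1 + X2)    ~ " << s12 / n << "\n";          // ~1.0
    std::cout << "λE(X1)        ~ " << lambda * s1 / n << "\n";  // ~1.5
    std::cout << "E(λX1)        ~ " << slx / n << "\n";          // ~1.5
}

Note that the derivation of E(X1 + X2) above never uses independence; linearity of expectation holds for any pair of random variables.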
Variance & Standard deviation
definitions
V(X) := E([X − E(X)]²)
σ(X) := √V(X)
The meaning of standard deviation
One way to look at the standard deviation is as an approximation of the "expected drift" from the expectation. The "expected drift" could be defined as:
ED(X) := E(|X − E(X)|)
This quantity is not easy to manipulate algebraically (the absolute value does not expand nicely), which is one reason the squared form is preferred.
Suppose that X can take only the two values k and −k and that E(X) = 0. Then:
V(X) = E([X − E(X)]²) = E(X²) = k²
and
σ(X) = √V(X) = √(k²) = k
and
ED(X) = E(|X − E(X)|) = E(|X|) = E(k) = k = σ(X)
V, σ and ED are unchanged by adding a constant, so any random variable X whose drifts all have the same absolute value k satisfies σ(X) = ED(X).
Whenever the drift values are not all the same, σ gives larger weight to larger drifts (it averages the squares and then takes the root), while ED is a plain average of the drifts; by the quadratic-arithmetic mean inequality this means σ(X) ≥ ED(X).
Example: suppose you perform the following experiment: you flip a coin; if it comes up heads you go 5 meters to the left, if tails, you go 5 meters to the right. The variance in this case is 25 and the standard deviation is 5. The expected drift is also 5 (all the drift values are equal). More on this example below.
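A sketch that simulates this experiment and estimates V(X), σ(X) and ED(X); the sample size and seed are arbitrary:

#include <cmath>
#include <iostream>
#include <random>

int main() {
    // One step of the walk: X = −5 on heads, +5 on tails, each with
    // probability 1/2, so E(X) = 0 and the drift is just |X|.
    std::mt19937 rng(1);
    std::bernoulli_distribution coin(0.5);
    const int n = 1000000;

    double s = 0, s2 = 0, sabs = 0;
    for (int i = 0; i < n; ++i) {
        double x = coin(rng) ? 5.0 : -5.0;
        s += x; s2 += x * x; sabs += std::abs(x);
    }
    double mean = s / n;                 // ~0
    double var = s2 / n - mean * mean;   // V(X) = E(X²) − E²(X)
    std::cout << "V(X)  ~ " << var << "\n";             // ~25
    std::cout << "σ(X)  ~ " << std::sqrt(var) << "\n";  // ~5
    std::cout << "ED(X) ~ " << sabs / n << "\n";        // ~5
}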
Alternative definition of variance
V(X) = E((X − E(X))²) = E(X² + E²(X) − 2X·E(X)) = E(X²) + E²(X) − 2E²(X) = E(X²) − E²(X)
hence:
V(X) = E(X²) − E²(X)
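Both forms computed side by side on a small made-up discrete distribution, as a sanity check that they agree:

#include <iostream>

int main() {
    // Check V(X) = E([X − E(X)]²) against V(X) = E(X²) − E²(X)
    // on a small made-up discrete distribution.
    double x[3] = {1.0, 2.0, 6.0};
    double p[3] = {0.5, 0.3, 0.2};

    double ex = 0, ex2 = 0;
    for (int i = 0; i < 3; ++i) { ex += p[i] * x[i]; ex2 += p[i] * x[i] * x[i]; }

    double v_def = 0;  // definitional form: E([X − E(X)]²)
    for (int i = 0; i < 3; ++i) v_def += p[i] * (x[i] - ex) * (x[i] - ex);

    double v_alt = ex2 - ex * ex;  // alternative form: E(X²) − E²(X)
    std::cout << "definition:  " << v_def << "\n";   // 3.61
    std::cout << "alternative: " << v_alt << "\n";   // 3.61
}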
variance (and SD) doesn't change by adding a constant
V(X + c) = E([X + c − E(X + c)]²) = E([X + c − E(X) − E(c)]²) = E([X − E(X)]²) = V(X), since E(c) = c.
variance of multiplication
V(λX) = E((λX)²) − E²(λX) = λ²E(X²) − λ²E²(X) = λ²(E(X²) − E²(X)) = λ²V(X)
hence:
V(λX) = λ²V(X)
SD of multiplication
σ(λX) = √V(λX) = √(λ²V(X)) = |λ|·√V(X) = |λ|σ(X)
hence:
σ(λX) = |λ|σ(X)
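A sketch checking the last three properties at once: adding a constant, scaling the variance, and scaling the SD. The distribution, c = 7 and λ = −3 are arbitrary choices; λ is deliberately negative to show why the absolute value is needed. Since every run reuses the same seed, the three estimates see the same underlying draws and the identities hold exactly on the sample:

#include <cmath>
#include <iostream>
#include <random>

// Population-style variance of n transformed draws from a fixed
// normal distribution (mean 1, sd 2; arbitrary choices).
template <typename F>
double var_of(F f, int n, unsigned seed) {
    std::mt19937 rng(seed);
    std::normal_distribution<double> dist(1.0, 2.0);
    double s = 0, s2 = 0;
    for (int i = 0; i < n; ++i) { double v = f(dist(rng)); s += v; s2 += v * v; }
    double m = s / n;
    return s2 / n - m * m;  // V = E(X²) − E²(X) on the sample
}

int main() {
    const int n = 1000000;
    const double c = 7.0, lambda = -3.0;
    // Same seed everywhere, so all three runs transform the same draws.
    double v  = var_of([](double x) { return x; }, n, 5);
    double vc = var_of([=](double x) { return x + c; }, n, 5);
    double vl = var_of([=](double x) { return lambda * x; }, n, 5);
    std::cout << "V(X)     ~ " << v << "\n";                       // ~4
    std::cout << "V(X + c) ~ " << vc << "\n";                      // same
    std::cout << "V(λX)/λ² ~ " << vl / (lambda * lambda) << "\n";  // same
    std::cout << "σ(λX)    ~ " << std::sqrt(vl)
              << "  vs  |λ|σ(X) ~ " << std::abs(lambda) * std::sqrt(v) << "\n";
}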
Variance of sum of random variables
V(X1 + X2) = E((X1 + X2)²) − E²(X1 + X2)
= E(X1² + X2² + 2X1X2) − [E(X1) + E(X2)]²
= E(X1²) − E²(X1) + E(X2²) − E²(X2) + 2[E(X1X2) − E(X1)E(X2)]
hence:
V(X1 + X2) = V(X1) + V(X2) + 2Cov(X1,X2)
When X1 and X2 are independent, Cov(X1,X2) = 0 and hence:
X1, X2 independent ⇒ V(X1 + X2) = V(X1) + V(X2)
When X1 and X2 are i.i.d (independent and identically distributed) then:
X1, X2 i.i.d ⇒ V(X1 + X2) = 2V(X1)
Or more generally:
X1, …, Xn i.i.d ⇒ V(X1 + … + Xn) = nV(X1)
hence:
X1, …, Xn i.i.d ⇒ σ(X1 + … + Xn) = √n·σ(X1)
Note the difference from summing the variable with itself (identically distributed but not independent):
V(X1 + X1) = V(2X1) = 4V(X1)
and
σ(X1 + X1) = σ(2X1) = 2σ(X1)
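A simulation contrasting the two cases; the uniform distribution on [0, 12] is an arbitrary choice, with V = 12²/12 = 12:

#include <iostream>
#include <random>

int main() {
    // Contrast V(X1 + X2) for two independent copies (expect 2V = 24)
    // with V(X1 + X1) = V(2X1) (expect 4V = 48), where X1, X2 are
    // uniform on [0, 12], so V = 12.
    std::mt19937 rng(9);
    std::uniform_real_distribution<double> u(0.0, 12.0);
    const int n = 1000000;

    double sa = 0, sa2 = 0, sb = 0, sb2 = 0;
    for (int i = 0; i < n; ++i) {
        double x1 = u(rng), x2 = u(rng);  // independent draws
        double a = x1 + x2, b = x1 + x1;  // i.i.d sum vs. same-variable sum
        sa += a; sa2 += a * a;
        sb += b; sb2 += b * b;
    }
    double ma = sa / n, mb = sb / n;
    std::cout << "V(X1 + X2) ~ " << sa2 / n - ma * ma << "   (2V = 24)\n";
    std::cout << "V(X1 + X1) ~ " << sb2 / n - mb * mb << "   (4V = 48)\n";
}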
more on the last result
We've shown that:
X1, …, Xn i.i.d ⇒ σ(X1 + … + Xn) = √n·σ(X1)
Why is this important?
σ is a measure of the expected drift. The last result shows that the drift of the sum grows like √n, i.e. less than linearly in the number of experiments; this means that the mean drift tends to zero:
σ((X1 + … + Xn)/n) = √n·σ(X1)/n = σ(X1)/√n → 0 as n → ∞
Recall the example of the random walk ±5. Now suppose you repeat the process n times. What is the expected drift?
The standard deviation, which can be considered a measure of that drift, is: σ = 5√n
The mean drift per step is: 5√n / n = 5/√n
For example, for 10000 iterations, the mean drift per step is: 5/√10000 = 0.05 meter. Instead of 5 meters in each step it is 5 centimeters. The total drift is only 500 meters instead of 50,000.
- *todo*: example of random walk ±5 gnuplot picture; the relation to the law of large numbers; is the fact that the frequency ratio converges an assumption of probability theory or a result?
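A simulation sketch of the shrinking mean drift (trial counts and seed are arbitrary). The estimated expected absolute drift is of the same order as σ = 5√n, roughly 0.8·σ for a long symmetric walk, and the per-step drift visibly shrinks with n:

#include <cmath>
#include <cstdio>
#include <random>

int main() {
    // Random walk with ±5 meter steps. For each n, estimate the expected
    // absolute distance from the start after n steps. It grows like √n
    // (about 0.8 · 5√n for a symmetric walk), so the per-step drift shrinks.
    std::mt19937 rng(2024);
    std::bernoulli_distribution coin(0.5);
    const int trials = 5000;

    for (int n : {100, 10000}) {
        double sum_abs = 0;
        for (int t = 0; t < trials; ++t) {
            double pos = 0;
            for (int i = 0; i < n; ++i) pos += coin(rng) ? 5.0 : -5.0;
            sum_abs += std::fabs(pos);
        }
        double mean_dist = sum_abs / trials;
        std::printf("n=%5d  E|walk| ~ %7.1f   5*sqrt(n) = %7.1f   per step ~ %.3f\n",
                    n, mean_dist, 5.0 * std::sqrt(static_cast<double>(n)),
                    mean_dist / n);
    }
}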
misc
E(X) is constant.
hence:
E(E(X)) = E(X), and a factor of E(X) can be pulled out of an expectation, e.g. E(X·E(X)) = E²(X).
Covariance
Alternative definition
Cov(X1,X2) = E((X1 − E(X1))(X2 − E(X2))) = E(X1X2 − X1E(X2) − X2E(X1) + E(X1)E(X2)) = E(X1X2) + E(X1)E(X2) − 2E(X1)E(X2) = E(X1X2) − E(X1)E(X2)
hence:
Cov(X1,X2) = E(X1X2) − E(X1)E(X2)
A special case is the covariance of a random variable with itself: Cov(X,X) = E(X·X) − E(X)E(X) = E(X²) − E²(X) = V(X)
Covariance of independent variables
Assume that X1 and X2 are independent:
E(X1X2) = ∑_{x1,x2} p(x1)p(x2)·x1x2 = (∑_{x1} p(x1)x1)(∑_{x2} p(x2)x2) = E(X1)E(X2)
And hence:
X1, X2 independent ⇒ Cov(X1,X2) = E(X1X2) − E(X1)E(X2) = 0
The converse is not true, however. For example, let X take the values −1, 0, 1 with equal probability and let Y = X². Then
Cov(X,Y) = E(XY) − E(X)E(Y) = E(X³) − E(X)E(X²) = 0 − 0 = 0
But of course, X and Y are very much dependent.
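A Monte Carlo sketch of both facts above; the normal distribution with standard deviation 3 (so V(X1) = 9) is an arbitrary choice:

#include <iostream>
#include <random>

int main() {
    // Estimate Cov(X1, X2) = E(X1X2) − E(X1)E(X2) for independent draws
    // (expect ~0) and Cov(X1, X1) (expect V(X1) = 9).
    std::mt19937 rng(7);
    std::normal_distribution<double> g(0.0, 3.0);  // V = 9
    const int n = 1000000;

    double s1 = 0, s2 = 0, s12 = 0, s11 = 0;
    for (int i = 0; i < n; ++i) {
        double a = g(rng), b = g(rng);  // independent draws
        s1 += a; s2 += b; s12 += a * b; s11 += a * a;
    }
    double e1 = s1 / n, e2 = s2 / n;
    std::cout << "Cov(X1, X2) ~ " << s12 / n - e1 * e2 << "   (independent: ~0)\n";
    std::cout << "Cov(X1, X1) ~ " << s11 / n - e1 * e1 << "   (= V(X1) ~ 9)\n";
}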
Wiener processes
(also known as "Brownian motion")
Let Z be a stochastic process with the following properties:
1. The change δZ in a small period of time δt is
δZ = ε·√δt
where ε ~ φ(0,1), i.e. ε is drawn from a normal distribution with mean 0 and standard deviation 1.
2. The values of δZ for any two non-overlapping intervals of time are independent.
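A discretized simulation sketch; δt, the horizon T = 1, the path count and the seed are arbitrary choices. Summing T/δt independent increments, each with variance δt, should give Z(T) − Z(0) with mean ≈ 0 and variance ≈ T:

#include <cmath>
#include <cstdio>
#include <random>

int main() {
    // Discretized Wiener process: each step adds δZ = ε·√δt with ε ~ N(0, 1).
    // Over T/δt independent steps, Z(T) − Z(0) should have mean 0 and
    // variance T (here T = 1); the sample statistics below estimate both.
    std::mt19937 rng(11);
    std::normal_distribution<double> eps(0.0, 1.0);
    const double dt = 0.001, T = 1.0;
    const double sdt = std::sqrt(dt);
    const int steps = static_cast<int>(T / dt), paths = 20000;

    double s = 0, s2 = 0;
    for (int p = 0; p < paths; ++p) {
        double z = 0;
        for (int i = 0; i < steps; ++i) z += eps(rng) * sdt;
        s += z; s2 += z * z;
    }
    double mean = s / paths;
    std::printf("mean of Z(T) ~ %f  (expect 0)\n", mean);
    std::printf("var  of Z(T) ~ %f  (expect T = 1)\n", s2 / paths - mean * mean);
}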
Summary
Expectation
- E(X) over the joint distribution equals E(X) over the marginal distribution
- E(X1 + X2) = E(X1) + E(X2)
- E(λX) = λE(X)
Variance and standard deviation
- V(X) = E(X²) − E²(X)
- V(λX) = λ²V(X)
- σ(λX) = |λ|σ(X)
- V(X1 + X2) = V(X1) + V(X2) + 2Cov(X1,X2)
- X1, X2 independent ⇒ V(X1 + X2) = V(X1) + V(X2)
- X1, …, Xn i.i.d ⇒ V(X1 + … + Xn) = nV(X1)
- X1, …, Xn i.i.d ⇒ σ(X1 + … + Xn) = √n·σ(X1)
Covariance
- Cov(X1,X2) = E(X1X2) − E(X1)E(X2)
Misc
#include <iostream>
int main() { std::cout << "hello lord\n"; }