In probability theory, the general[1] form of Bienaymé's identity states that
$$\operatorname{Var}\left(\sum_{i=1}^{n}X_i\right)=\sum_{i=1}^{n}\operatorname{Var}(X_i)+2\sum_{\substack{i,j=1\\ i<j}}^{n}\operatorname{Cov}(X_i,X_j)=\sum_{i,j=1}^{n}\operatorname{Cov}(X_i,X_j).$$
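As a quick illustration, the identity can be checked numerically: the variance of a sum of correlated variables equals the sum of all entries of their covariance matrix. The following sketch (using NumPy, with an arbitrary covariance matrix chosen purely for the example) compares the empirical variance of the sum with the sum of all pairwise sample covariances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary covariance matrix for three correlated variables (example choice).
cov = np.array([[ 2.0, 0.8, -0.3],
                [ 0.8, 1.5,  0.4],
                [-0.3, 0.4,  1.0]])
samples = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=200_000)

# Left-hand side: variance of the sum X_1 + X_2 + X_3.
lhs = samples.sum(axis=1).var(ddof=1)

# Right-hand side: sum of all entries Cov(X_i, X_j) of the sample covariance matrix.
rhs = np.cov(samples, rowvar=False).sum()

print(lhs, rhs)  # the two estimates agree up to sampling error
```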
This can be simplified if $X_1, \ldots, X_n$ are pairwise independent or just uncorrelated, integrable random variables, each with finite second moment.[2] This simplification gives:
$$\operatorname{Var}\left(\sum_{i=1}^{n}X_i\right)=\sum_{k=1}^{n}\operatorname{Var}(X_k).$$
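A minimal numerical sketch of the simplified formula, assuming three independent (hence uncorrelated) normal variables with example variances 4, 1 and 0.25:

```python
import numpy as np

rng = np.random.default_rng(1)

# Three independent variables with variances 4, 1 and 0.25 (example choice).
x = np.column_stack([
    rng.normal(scale=2.0, size=100_000),   # Var = 4
    rng.normal(scale=1.0, size=100_000),   # Var = 1
    rng.normal(scale=0.5, size=100_000),   # Var = 0.25
])

var_of_sum = x.sum(axis=1).var(ddof=1)      # Var(X_1 + X_2 + X_3)
sum_of_vars = x.var(axis=0, ddof=1).sum()   # Var(X_1) + Var(X_2) + Var(X_3)

print(var_of_sum, sum_of_vars)  # both close to 5.25
```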
The above expression is sometimes referred to as Bienaymé's formula. Bienaymé's identity may be used in proving certain variants of the law of large numbers.[3]
Figure: Estimated variance of the cumulative sum of i.i.d. normally distributed random variables (which could represent a Gaussian random walk approximating a Wiener process); the sample variance is computed over 300 realizations of the corresponding random process.
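A sketch of the kind of experiment the figure describes (assumed parameters: standard normal increments, 300 realizations): by the simplified formula, the variance of the cumulative sum after k steps is k times the increment variance, and the sample variance across realizations tracks that line.

```python
import numpy as np

rng = np.random.default_rng(2)
n_steps, n_realizations = 1_000, 300  # 300 realizations, as in the figure

# Gaussian random walk: cumulative sums of i.i.d. standard normal increments.
increments = rng.normal(size=(n_realizations, n_steps))
walks = increments.cumsum(axis=1)

# Sample variance across realizations at each step; by Bienaymé's formula
# the true variance after k steps is k * Var(increment) = k.
sample_var = walks.var(axis=0, ddof=1)
print(sample_var[[9, 99, 999]])  # roughly 10, 100, 1000
```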