Some Formulas of Mean and Variance

We consider two random variables X and Y.

1. Theorem: E(X+Y) = E(X) + E(Y).

Proof:

For discrete random variables X and Y, it is given by:

$$
E(X+Y) = \sum_i \sum_j (x_i + y_j) f_{xy}(x_i, y_j)
= \sum_i \sum_j x_i f_{xy}(x_i, y_j) + \sum_i \sum_j y_j f_{xy}(x_i, y_j)
= E(X) + E(Y).
$$

For continuous random variables X and Y, we can show:

$$
E(X+Y) = \int\!\!\int (x+y) f_{xy}(x,y)\,dx\,dy
= \int\!\!\int x f_{xy}(x,y)\,dx\,dy + \int\!\!\int y f_{xy}(x,y)\,dx\,dy
= E(X) + E(Y).
$$
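As a quick numerical illustration (a sketch of mine, not part of the original notes; it assumes NumPy), linearity of expectation holds even when X and Y are dependent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Deliberately dependent variables: independence is NOT needed for Theorem 1.
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)
y = 3.0 * x + rng.normal(size=x.size)  # y is strongly correlated with x

print(np.mean(x + y))           # sample estimate of E(X+Y)
print(np.mean(x) + np.mean(y))  # E(X) + E(Y): agrees up to sampling error
```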

2. Theorem: E(XY) = E(X)E(Y), when X is independent of Y.

Proof:

For discrete random variables X and Y,

$$
E(XY) = \sum_i \sum_j x_i y_j f_{xy}(x_i, y_j)
= \sum_i \sum_j x_i y_j f_x(x_i) f_y(y_j)
= \Big(\sum_i x_i f_x(x_i)\Big) \Big(\sum_j y_j f_y(y_j)\Big)
= E(X)E(Y).
$$

If X is independent of Y, the second equality holds, i.e., f_{xy}(x_i, y_j) = f_x(x_i) f_y(y_j).

For continuous random variables X and Y,

$$
E(XY) = \int\!\!\int xy f_{xy}(x,y)\,dx\,dy
= \int\!\!\int xy f_x(x) f_y(y)\,dx\,dy
= \Big(\int_{-\infty}^{\infty} x f_x(x)\,dx\Big) \Big(\int_{-\infty}^{\infty} y f_y(y)\,dy\Big)
= E(X)E(Y).
$$

When X is independent of Y, we have f_{xy}(x,y) = f_x(x) f_y(y) in the second equality.
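A Monte Carlo sketch of Theorem 2 (my illustration, assuming NumPy): the product rule holds for independently generated samples and generally fails for dependent ones:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent draws: E(XY) = E(X)E(Y) should hold approximately.
x = rng.exponential(scale=2.0, size=n)
y = rng.normal(loc=3.0, scale=1.0, size=n)
print(np.mean(x * y), np.mean(x) * np.mean(y))  # roughly equal

# Dependent draws: the rule fails, since E(X^2) != (E(X))^2 in general.
print(np.mean(x * x), np.mean(x) ** 2)
```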

3. Theorem: Cov(X,Y) = E(XY) - E(X)E(Y).

Proof:

For both discrete and continuous random variables, Cov(X,Y) can be rewritten as follows:

$$
\begin{aligned}
Cov(X,Y) &= E((X-\mu_x)(Y-\mu_y)) \\
&= E(XY - \mu_x Y - \mu_y X + \mu_x \mu_y) \\
&= E(XY) - E(\mu_x Y) - E(\mu_y X) + \mu_x \mu_y \\
&= E(XY) - \mu_x E(Y) - \mu_y E(X) + \mu_x \mu_y \\
&= E(XY) - \mu_x \mu_y - \mu_y \mu_x + \mu_x \mu_y \\
&= E(XY) - \mu_x \mu_y \\
&= E(XY) - E(X)E(Y).
\end{aligned}
$$

In the fourth equality, the theorem in Section 3.1 is used, i.e., E(μ_x Y) = μ_x E(Y) and E(μ_y X) = μ_y E(X).
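The shortcut in Theorem 3 can be checked numerically (a sketch assuming NumPy; the coefficients below are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=x.size)  # correlated with x by construction

# Definition: E((X - mu_x)(Y - mu_y))
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# Theorem 3 shortcut: E(XY) - E(X)E(Y)
cov_short = np.mean(x * y) - x.mean() * y.mean()
print(cov_def, cov_short)  # identical up to floating-point rounding
```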

4. Theorem: Cov(X,Y) = 0, when X is independent of Y.

Proof:

From the above two theorems, we have E(XY) = E(X)E(Y) when X is independent of Y, and Cov(X,Y) = E(XY) - E(X)E(Y). Therefore, Cov(X,Y) = 0 is obtained when X is independent of Y.
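A minimal check of Theorem 4 (my sketch, assuming NumPy): for independently generated samples the sample covariance is close to zero, though never exactly zero at finite sample size:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(size=1_000_000)
y = rng.uniform(size=1_000_000)  # generated independently of x

print(np.mean(x * y) - x.mean() * y.mean())  # near 0, up to sampling error
```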

5. Definition: The correlation coefficient between X and Y, denoted by ρ_xy, is defined as:

$$
\rho_{xy} = \frac{Cov(X,Y)}{\sqrt{V(X)}\sqrt{V(Y)}} = \frac{Cov(X,Y)}{\sigma_x \sigma_y}.
$$

ρ_xy > 0 ⟹ positive correlation between X and Y (ρ_xy → 1 ⟹ strong positive correlation);
ρ_xy < 0 ⟹ negative correlation between X and Y (ρ_xy → -1 ⟹ strong negative correlation).
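A small sketch of the definition (mine, assuming NumPy). Here Y = -2X + ε with X and ε standard normal, so Cov(X,Y) = -2, V(Y) = 5, and ρ_xy = -2/√5 ≈ -0.894:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)
y = -2.0 * x + rng.normal(size=x.size)  # negatively related to x

cov = np.mean(x * y) - x.mean() * y.mean()
rho = cov / (x.std() * y.std())
print(rho)                      # close to -2/sqrt(5) = -0.894
print(np.corrcoef(x, y)[0, 1])  # NumPy's built-in estimate agrees
```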

6. Theorem: ρ_xy = 0, when X is independent of Y.

Proof:

When X is independent of Y, we have Cov(X,Y) = 0. We obtain the result:

$$
\rho_{xy} = \frac{Cov(X,Y)}{\sqrt{V(X)}\sqrt{V(Y)}} = 0.
$$

However, note that ρ_xy = 0 does not mean independence between X and Y.
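The classic counterexample for the last remark is Y = X² with X symmetric around zero: Y is completely determined by X, yet Cov(X,Y) = E(X³) = 0. A sketch (mine, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=1_000_000)  # symmetric around 0
y = x ** 2                      # fully determined by x, hence dependent

cov = np.mean(x * y) - x.mean() * y.mean()  # estimates E(X^3) - E(X)E(X^2) = 0
print(cov / (x.std() * y.std()))            # near 0 despite total dependence
```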

7. Theorem: V(X±Y) = V(X) ± 2Cov(X,Y) + V(Y).

Proof:

For both discrete and continuous random variables, V(X±Y) is rewritten as follows:

$$
\begin{aligned}
V(X \pm Y) &= E\Big( \big( (X \pm Y) - E(X \pm Y) \big)^2 \Big) \\
&= E\Big( \big( (X - \mu_x) \pm (Y - \mu_y) \big)^2 \Big) \\
&= E\Big( (X-\mu_x)^2 \pm 2(X-\mu_x)(Y-\mu_y) + (Y-\mu_y)^2 \Big) \\
&= E\big((X-\mu_x)^2\big) \pm 2E\big((X-\mu_x)(Y-\mu_y)\big) + E\big((Y-\mu_y)^2\big) \\
&= V(X) \pm 2Cov(X,Y) + V(Y).
\end{aligned}
$$
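Both signs of Theorem 7 can be verified on simulated, deliberately correlated data (a sketch of mine, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=1_000_000)
y = 0.8 * x + rng.normal(size=x.size)  # correlated with x

cov = np.mean(x * y) - x.mean() * y.mean()
print(np.var(x + y), np.var(x) + 2 * cov + np.var(y))  # plus sign
print(np.var(x - y), np.var(x) - 2 * cov + np.var(y))  # minus sign
```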

8. Theorem: -1 ≤ ρ_xy ≤ 1.

Proof:

Consider the following function of t: f(t) = V(Xt - Y), which is always greater than or equal to zero because of the definition of variance. Therefore, for all t, we have f(t) ≥ 0. f(t) is rewritten as follows:

$$
\begin{aligned}
f(t) &= V(Xt - Y) = V(Xt) - 2Cov(Xt, Y) + V(Y) \\
&= t^2 V(X) - 2t\,Cov(X,Y) + V(Y) \\
&= V(X) \Big( t - \frac{Cov(X,Y)}{V(X)} \Big)^2 + V(Y) - \frac{(Cov(X,Y))^2}{V(X)}.
\end{aligned}
$$

In order to have f(t) ≥ 0 for all t, we need the following condition:

$$
V(Y) - \frac{(Cov(X,Y))^2}{V(X)} \ge 0,
$$

because the first term in the last equality is nonnegative, which implies:

$$
(Cov(X,Y))^2 \le V(X)V(Y).
$$

Therefore, we have:

$$
-1 \le \frac{Cov(X,Y)}{\sqrt{V(X)}\sqrt{V(Y)}} \le 1.
$$

From the definition of the correlation coefficient, i.e., ρ_xy = Cov(X,Y)/(√V(X)√V(Y)), we obtain -1 ≤ ρ_xy ≤ 1.
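The two ingredients of the proof, f(t) = V(Xt - Y) ≥ 0 and the resulting bound (Cov(X,Y))² ≤ V(X)V(Y), can be observed numerically (my sketch, assuming NumPy; the t values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=100_000)
y = 0.3 * x + rng.normal(size=x.size)
cov = np.mean(x * y) - x.mean() * y.mean()

# f(t) = V(Xt - Y) is a nonnegative parabola in t.
for t in (-2.0, 0.0, 0.5, 3.0):
    print(t, np.var(t * x - y))  # every value is >= 0

# Cauchy-Schwarz-type bound from the proof.
print(cov ** 2 <= np.var(x) * np.var(y))  # True
```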

9. Theorem: V(X±Y) = V(X) + V(Y), when X is independent of Y.

Proof:

From the theorem above, V(X±Y) = V(X) ± 2Cov(X,Y) + V(Y) generally holds. When random variables X and Y are independent, we have Cov(X,Y) = 0. Therefore, V(X±Y) = V(X) + V(Y) holds when X is independent of Y.
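Theorem 9 matches the coin-flip intuition: two independent fair coins (1 = heads, 0 = tails) each have variance 1/4, and their sum has variance 1/2. A sketch (mine, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.integers(0, 2, size=1_000_000)  # fair coin, Var = 1/4
y = rng.integers(0, 2, size=1_000_000)  # independent fair coin

print(np.var(x + y))          # close to 1/2
print(np.var(x) + np.var(y))  # close to 1/4 + 1/4 = 1/2
```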

10. Theorem: For n random variables X_1, X_2, ..., X_n,

$$
E\Big(\sum_i a_i X_i\Big) = \sum_i a_i \mu_i,
\qquad
V\Big(\sum_i a_i X_i\Big) = \sum_i \sum_j a_i a_j Cov(X_i, X_j),
$$

where E(X_i) = μ_i and a_i is a constant value. Especially, when X_1, X_2, ..., X_n are mutually independent, we have the following:

$$
V\Big(\sum_i a_i X_i\Big) = \sum_i a_i^2 V(X_i).
$$

Proof:

For the mean of Σ_i a_i X_i, the following representation is obtained:

$$
E\Big(\sum_i a_i X_i\Big) = \sum_i E(a_i X_i) = \sum_i a_i E(X_i) = \sum_i a_i \mu_i.
$$
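Theorem 10 in matrix form says E(aᵀX) = aᵀμ and V(aᵀX) = aᵀΣa, where Σ is the covariance matrix. A sketch (mine, assuming NumPy; Σ, μ, and a below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(9)

# Three correlated variables with known covariance matrix Sigma.
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
mu = np.array([1.0, -2.0, 0.5])
a = np.array([2.0, -1.0, 3.0])  # the constants a_i

X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
s = X @ a  # realizations of sum_i a_i X_i

print(s.mean(), a @ mu)        # E(sum a_i X_i) = sum_i a_i mu_i
print(s.var(), a @ Sigma @ a)  # V(...) = sum_ij a_i a_j Cov(X_i, X_j)
```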