Reminder No. 1: Uncorrelated vs. Independent

36-402, Advanced Data Analysis

Last updated: 27 February 2013

A reminder about the difference between two variables being uncorrelated and their being independent.

Two random variables X and Y are uncorrelated when their correlation coefficient is zero:

    ρ(X,Y) = 0                                              (1)

Since

    ρ(X,Y) = Cov[X,Y] / √(Var[X] Var[Y])                    (2)

being uncorrelated is the same as having zero covariance. Since

    Cov[X,Y] = E[XY] − E[X]E[Y]                             (3)

having zero covariance, and so being uncorrelated, is the same as

    E[XY] = E[X]E[Y]                                        (4)

One says that "the expectation of the product factors". If ρ(X,Y) ≠ 0, then X and Y are correlated.

Two random variables are independent when their joint probability distribution is the product of their marginal probability distributions: for all x and y,

    p_{X,Y}(x, y) = p_X(x) p_Y(y)                           (5)

Equivalently,¹ the conditional distribution is the same as the marginal distribution:

    p_{Y|X}(y|x) = p_Y(y)                                   (6)

If X and Y are not independent, then they are dependent. If, in particular, Y is a function of X, then they are always dependent.²

Thanks to Prof. Howard Seltman for suggestions.

¹ Why is this equivalent?

² For the sake of mathematical quibblers: a non-constant function of X.
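As a quick numerical sanity check of equations (2)–(4), the following sketch (in Python with numpy, rather than the R used later in this note; the simulation is illustrative and not part of the original) draws independent samples and confirms that the expectation of the product factors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent uniform draws on [-1, 1].
x = rng.uniform(-1, 1, n)
y = rng.uniform(-1, 1, n)

# Eq. (3): Cov[X,Y] = E[XY] - E[X]E[Y]; for independent X and Y this is
# close to 0, i.e. eq. (4) holds: the expectation of the product factors.
cov = np.mean(x * y) - np.mean(x) * np.mean(y)

# Eq. (2): the correlation coefficient normalises the covariance by the
# standard deviations, so it is close to 0 as well.
rho = cov / np.sqrt(np.var(x) * np.var(y))

print(cov, rho)  # both close to 0
```

With a million draws, both quantities come out within about 0.01 of zero; the converse direction, of course, is exactly what the counterexample below refutes.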

If X and Y are independent, then they are also uncorrelated. To see this, write the expectation of their product:

    E[XY] = ∫∫ x y p_{X,Y}(x, y) dx dy                      (7)
          = ∫∫ x y p_X(x) p_Y(y) dx dy                      (8)
          = ∫ x p_X(x) [ ∫ y p_Y(y) dy ] dx                 (9)
          = [ ∫ x p_X(x) dx ] [ ∫ y p_Y(y) dy ]             (10)
          = E[X] E[Y]                                       (11)

However, if X and Y are uncorrelated, then they can still be dependent. To see an extreme example of this, let X be uniformly distributed on the interval [−1, 1]. If X ≤ 0, then Y = −X, while if X is positive, then Y = X; that is, Y = |X|. You can easily check for yourself that:

- Y is uniformly distributed on [0, 1]

- E[XY | X ≤ 0] = ∫ from −1 to 0 of −x² dx = −1/3

- E[XY | X > 0] = ∫ from 0 to 1 of x² dx = +1/3

- E[XY] = 0 (hint: law of total expectation); since E[X] = 0 as well, Cov[X,Y] = 0 by equation (3).

The joint distribution of X and Y is not uniform on the rectangle [−1, 1] × [0, 1], as it would be if X and Y were independent (Figure 1). The only general case when lack of correlation implies independence is when the joint distribution of X and Y is Gaussian.

    x <- runif(200, min=-1, max=1)
    y <- ifelse(x > 0, x, -x)
    plot(x, y, pch=16)
    rug(x, side=1, col="grey")
    rug(y, side=2, col="grey")

Figure 1: An example of two random variables which are uncorrelated but strongly dependent. The grey "rug plots" on the axes show the marginal distributions of the samples from X and Y.
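The R snippet above only draws the picture; the counterexample can also be checked numerically. A minimal sketch in Python with numpy (not from the original note; the sample size and tolerances are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# X uniform on [-1, 1]; Y = |X| is a deterministic, hence dependent,
# function of X.
x = rng.uniform(-1, 1, 1_000_000)
y = np.abs(x)

# Uncorrelated: the sample correlation coefficient is close to zero,
# even though Y is completely determined by X.
rho = np.corrcoef(x, y)[0, 1]

# The two conditional expectations of XY cancel, matching the note:
# E[XY | X <= 0] is about -1/3 and E[XY | X > 0] is about +1/3.
neg = np.mean(x[x <= 0] * y[x <= 0])
pos = np.mean(x[x > 0] * y[x > 0])

print(rho, neg, pos)
```

Running this gives a sample correlation within a few thousandths of zero, while the two conditional expectations sit near ∓1/3, exactly the cancellation the law of total expectation predicts.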
