Chapter 5. Multiple Random Variables

5.4: Covariance and Correlation

Slides (Google Drive)

Alex Tsun

Video (YouTube)

In this section, we'll learn about covariance, which, as you might guess, is related to variance. It is a function of two random variables, and tells us whether they have a positive or negative linear relationship. It also helps us finally compute the variance of a sum of dependent random variables, which we have not yet been able to do.

5.4.1 Covariance and Properties

We will start with the definition of covariance: $\text{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])]$. By LOTUS, we know this is equal to (where $\mu_X = E[X]$ and $\mu_Y = E[Y]$):

$$\sum_x \sum_y (x - \mu_X)(y - \mu_Y)\, p_{X,Y}(x, y)$$

Intuitively, we can see the following possibilities:

$x > \mu_X,\ y > \mu_Y \implies (x - \mu_X)(y - \mu_Y) > 0$ ($X, Y$ both above their means)

$x < \mu_X,\ y < \mu_Y \implies (x - \mu_X)(y - \mu_Y) > 0$ ($X, Y$ both below their means)

$x < \mu_X,\ y > \mu_Y \implies (x - \mu_X)(y - \mu_Y) < 0$ ($X$ below its mean, $Y$ above its mean)

$x > \mu_X,\ y < \mu_Y \implies (x - \mu_X)(y - \mu_Y) < 0$ ($X$ above its mean, $Y$ below its mean)

So we get a weighted average (by $p_{X,Y}$) of these positive or negative quantities. Just with this brief intuition, we can say that covariance is positive when $X, Y$ are usually both above/below their means, and negative if they are opposite. That is, covariance is positive in general when increasing one variable leads to an increase in the other, and negative when increasing one variable leads to a decrease in the other.
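To make the double sum concrete, here is a minimal Python sketch that computes the covariance directly from a joint PMF via LOTUS; the PMF values are made-up numbers, chosen so that $X$ and $Y$ tend to land on the same side of their means.

```python
# Covariance from a joint PMF via LOTUS (the PMF values here are made up).
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}  # p_{X,Y}(x, y)

mu_x = sum(x * p for (x, y), p in pmf.items())  # E[X] = 0.5
mu_y = sum(y * p for (x, y), p in pmf.items())  # E[Y] = 0.5

# Weighted average of (x - mu_x)(y - mu_y), weighted by p_{X,Y}(x, y).
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in pmf.items())
print(cov)  # 0.15 > 0: X and Y are usually on the same side of their means
```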

Definition 5.4.1: Covariance

Let $X, Y$ be random variables. The covariance of $X$ and $Y$ is:

$$\text{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$$

This should remind you of the definition of variance: think of replacing $Y$ with $X$ and you'll see it! Note: covariance can be negative, unlike variance.
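Both expressions in the definition can be estimated from samples. The sketch below simulates a pair of dependent random variables (an arbitrary choice: $Y = X + \text{noise}$) and checks that the two formulas agree.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Dependent RVs: Y = X + noise, so Cov(X, Y) = Var(X) = 1.
x = rng.normal(0, 1, n)
y = x + rng.normal(0, 1, n)

cov_def = np.mean((x - x.mean()) * (y - y.mean()))  # E[(X-E[X])(Y-E[Y])]
cov_alt = np.mean(x * y) - x.mean() * y.mean()      # E[XY] - E[X]E[Y]
print(cov_def, cov_alt)                             # both approximately 1
```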

Covariance satisfies the following properties:

1. If $X \perp Y$, then $\text{Cov}(X, Y) = 0$ (but not necessarily vice versa: the covariance can be zero without $X$ and $Y$ being independent).
2. $\text{Cov}(X, X) = \text{Var}(X)$. (Just plug in $Y = X$.)
3. $\text{Cov}(X, Y) = \text{Cov}(Y, X)$. (Multiplication is commutative.)
4. $\text{Cov}(X + c, Y) = \text{Cov}(X, Y)$. (Shifting doesn't and shouldn't affect the covariance.)
5. $\text{Cov}(aX + bY, Z) = a\text{Cov}(X, Z) + b\text{Cov}(Y, Z)$. This can be easily remembered like the distributive property of scalars: $(aX + bY)Z = a(XZ) + b(YZ)$.
6. $\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y)$, and hence if $X \perp Y$, then $\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)$ (as we discussed earlier).
7. $\text{Cov}\left(\sum_{i=1}^n X_i,\ \sum_{j=1}^m Y_j\right) = \sum_{i=1}^n \sum_{j=1}^m \text{Cov}(X_i, Y_j)$. That is, covariance works like FOIL (first, outer, inner, last) for multiplication of sums: $(a + b + c)(d + e) = ad + ae + bd + be + cd + ce$.
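These properties are easy to sanity-check numerically. The sketch below verifies properties 4, 5, and 6 on simulated dependent samples (the particular distributions are arbitrary choices), using the empirical version of $E[XY] - E[X]E[Y]$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

def cov(a, b):
    """Empirical covariance: E[AB] - E[A]E[B]."""
    return np.mean(a * b) - a.mean() * b.mean()

x = rng.normal(0, 1, n)
y = x + rng.normal(0, 1, n)        # dependent on x
z = x + rng.uniform(-1, 1, n)      # also dependent on x

# Property 4: shifting one variable doesn't change the covariance.
print(cov(x + 3, y), cov(x, y))

# Property 5: Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z).
print(cov(2*x + 5*y, z), 2*cov(x, z) + 5*cov(y, z))

# Property 6: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
print(np.var(x + y), np.var(x) + np.var(y) + 2*cov(x, y))
```

Each line should print a pair of (nearly) equal numbers.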

Proof of Covariance Alternate Formula. We will prove that $\text{Cov}(X, Y) = E[XY] - E[X]E[Y]$:

$$\begin{aligned}
\text{Cov}(X, Y) &= E[(X - E[X])(Y - E[Y])] && \text{[def of covariance]} \\
&= E[XY - E[X]Y - XE[Y] + E[X]E[Y]] && \text{[algebra]} \\
&= E[XY] - E[X]E[Y] - E[X]E[Y] + E[X]E[Y] && \text{[Linearity of Expectation]} \\
&= E[XY] - E[X]E[Y] && \text{[algebra]}
\end{aligned}$$

Proof of Property 1: Covariance of Independent RVs is 0. We actually proved in 5.1 already that $E[XY] = E[X]E[Y]$ when $X, Y$ are independent. Hence,

$$\text{Cov}(X, Y) = E[XY] - E[X]E[Y] = 0$$

Proof of Property 6: Variance of Sum of RVs. We will show that in general, for any RVs $X$ and $Y$,

$$\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y)$$

$$\begin{aligned}
\text{Var}(X + Y) &= \text{Cov}(X + Y,\ X + Y) && \text{[covariance with self = variance]} \\
&= \text{Cov}(X, X) + \text{Cov}(X, Y) + \text{Cov}(Y, X) + \text{Cov}(Y, Y) && \text{[covariance like FOIL]} \\
&= \text{Var}(X) + 2\text{Cov}(X, Y) + \text{Var}(Y) && \text{[covariance with self, and symmetry]}
\end{aligned}$$

Example(s)

Let $X$ and $Y$ be two independent $\mathcal{N}(0, 1)$ random variables and:

$$Z = 1 + X + XY^2$$

$$W = 1 + X$$

Find $\text{Cov}(Z, W)$.

Solution: First note that $E[X^2] = \text{Var}(X) + E[X]^2 = 1 + 0^2 = 1$ (rearrange the variance formula and solve for $E[X^2]$). Similarly, $E[Y^2] = 1$.

$$\begin{aligned}
\text{Cov}(Z, W) &= \text{Cov}(1 + X + XY^2,\ 1 + X) \\
&= \text{Cov}(X + XY^2,\ X) && \text{[Property 4]} \\
&= \text{Cov}(X, X) + \text{Cov}(XY^2, X) && \text{[Property 7]} \\
&= \text{Var}(X) + E[X^2Y^2] - E[XY^2]E[X] && \text{[Property 2 and def of covariance]} \\
&= 1 + E[X^2]E[Y^2] - E[X]^2E[Y^2] && \text{[because $X$ and $Y$ are independent]} \\
&= 1 + 1 - 0 = 2
\end{aligned}$$
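A Monte Carlo check of this answer takes a few lines: simulate independent standard normals, build $Z$ and $W$, and the empirical covariance should land near the exact value of $2$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

x = rng.normal(0, 1, n)
y = rng.normal(0, 1, n)     # independent of x

z = 1 + x + x * y**2
w = 1 + x

cov_zw = np.mean(z * w) - z.mean() * w.mean()
print(cov_zw)               # approximately 2
```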

5.4.2 (Pearson) Correlation

Covariance has a "problem" in measuring linear relationships, in that $\text{Cov}(X, Y)$ will be positive when there is a positive linear relationship and negative when there is a negative linear relationship, but $\text{Cov}(2X, Y) = 2\text{Cov}(X, Y)$. Scaling one of the random variables should not affect the strength of their relationship, which it seems to do. It would be great if we defined some metric that was normalized (had a maximum and minimum) and was invariant to scale. This metric is called correlation!

Definition 5.4.2: (Pearson) Correlation

Let $X, Y$ be random variables. The (Pearson) correlation of $X$ and $Y$ is:

$$\rho(X, Y) = \frac{\text{Cov}(X, Y)}{\sqrt{\text{Var}(X)}\sqrt{\text{Var}(Y)}}$$

We can prove by the Cauchy-Schwarz inequality (from linear algebra) that $-1 \le \rho(X, Y) \le 1$. That is, correlation is just a normalized version of covariance. Most notably, $\rho(X, Y) = \pm 1$ if and only if $Y = aX + b$ for some constants $a, b \in \mathbb{R}$, and then the sign of $\rho$ is the same as that of $a$.

In linear regression ("line-fitting") from high school science class, you may have calculated some $R^2$ with $0 \le R^2 \le 1$; this is actually $\rho^2$, and it measures how well a linear relationship exists between $X$ and $Y$: $R^2$ is the percentage of variance in $Y$ which can be explained by $X$.

Let's take a look at some example graphs which show a sample of data and their (Pearson) correlations, to get some intuition.

The 1st (purple) plot has a perfect negative linear relationship, so the correlation is $-1$.

The 2nd (green) plot has a positive relationship, but it is not perfect, so the correlation is around $+0.9$.

The 3rd (orange) plot is a perfectly linear positive relationship, so the correlation is $+1$.

The 4th (red) plot appears to have data that is independent, so the correlation is $0$.

The 5th (blue) plot has a negative trend that isn't strongly linear, so the correlation is around $-0.6$.

Example(s)

Suppose $X$ and $Y$ are random variables, where $Y = -5X + 2$. Show that, since there is a perfect negative linear relationship, $\rho(X, Y) = -1$.

Solution: To find the correlation, we need the covariance and the two individual variances. Let's write them in terms of $\text{Var}(X)$.

$$\text{Var}(Y) = \text{Var}(-5X + 2) = (-5)^2\text{Var}(X) = 25\text{Var}(X)$$

By properties of covariance (shifting by 2 doesn't matter),

$$\text{Cov}(X, Y) = \text{Cov}(X, -5X + 2) = -5\text{Cov}(X, X) = -5\text{Var}(X)$$

Finally,

$$\rho(X, Y) = \frac{\text{Cov}(X, Y)}{\sqrt{\text{Var}(X)}\sqrt{\text{Var}(Y)}} = \frac{-5\text{Var}(X)}{\sqrt{\text{Var}(X)}\sqrt{25\text{Var}(X)}} = \frac{-5\text{Var}(X)}{5\text{Var}(X)} = -1$$

Note that the $-5$ and $2$ did not matter at all (except that $-5$ was negative and made the correlation negative)!
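Numerically, `numpy.corrcoef` computes exactly this normalized quantity, so a perfect negative linear relationship should report a correlation of $-1$ up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0, 1, 10_000)
y = -5 * x + 2                  # perfect negative linear relationship

print(np.corrcoef(x, y)[0, 1])  # -1.0 (up to floating-point error)
```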

5.4.3 Variance of Sums of Random Variables

Perhaps the most useful application of covariance is in finding the variance of a sum of dependent random variables. We'll extend the case of $\text{Var}(X + Y)$ to more than two random variables.

Theorem 5.4.1: Variance of Sums of RVs

If $X_1, X_2, \ldots, X_n$ are random variables, then

$$\text{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i<j} \text{Cov}(X_i, X_j)$$

Proof of Variance of Sums of RVs. This looks more complicated than it is: the variance of the sum $X_1 + X_2 + \cdots + X_n$ is just its covariance with itself! We'll use $i$ to index one of the sums $\sum_{i=1}^n X_i$ and $j$ for the other $\sum_{j=1}^n X_j$. Keep in mind these both represent the same quantity; you'll see why we used different dummy variables soon!

$$\begin{aligned}
\text{Var}\left(\sum_{i=1}^n X_i\right) &= \text{Cov}\left(\sum_{i=1}^n X_i,\ \sum_{j=1}^n X_j\right) && \text{[covariance with self = variance]} \\
&= \sum_{i=1}^n \sum_{j=1}^n \text{Cov}(X_i, X_j) && \text{[by FOIL]} \\
&= \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i<j} \text{Cov}(X_i, X_j) && \text{[split $i = j$ terms from $i \ne j$ terms]}
\end{aligned}$$

The last step splits the $n^2$ terms into the $n$ diagonal terms where $i = j$ (each of which is $\text{Cov}(X_i, X_i) = \text{Var}(X_i)$) and the remaining pairs of covariance. It is illustrated below, where the red diagonal is the covariance of a variable with itself (which is its variance), and the green off-diagonal entries are the symmetric pairs of covariance. We used the fact that $\text{Cov}(X_i, X_j) = \text{Cov}(X_j, X_i)$ to require us to only sum the lower triangle (where $i < j$), and multiply by 2 to account for the upper triangle.

It is important to remember that if all the RVs were independent, all the $\text{Cov}(X_i, X_j)$ terms (for $i \ne j$) would be zero, and so we would just be left with the sum of the variances as we showed earlier!
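The diagonal/off-diagonal picture can be reproduced with an empirical covariance matrix: summing all $n^2$ entries should match the variance of the sum. Below is a sketch with three dependent variables built from an arbitrary shared noise source.

```python
import numpy as np

rng = np.random.default_rng(4)
n_samples = 500_000

# Three dependent RVs sharing a common noise source.
base = rng.normal(0, 1, n_samples)
xs = np.vstack([
    base + rng.normal(0, 1, n_samples),
    2 * base + rng.normal(0, 1, n_samples),
    -base + rng.normal(0, 1, n_samples),
])

# Sum of all entries of the covariance matrix: the diagonal contributes
# the Var(X_i) terms, the off-diagonal the 2 * sum_{i<j} Cov(X_i, X_j) terms.
cov_matrix = np.cov(xs)
print(cov_matrix.sum(), np.var(xs.sum(axis=0), ddof=1))  # approximately equal
```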

Example(s)

Recall the hat check problem in 3.3: we had $n$ people who go to a party and leave their hats with a hat check person. At the end of the party, the hats are returned randomly though. We let $X$ be the number of people who get their original hat back. We solved for $E[X]$ with indicator random variables $X_1, \ldots, X_n$ for whether the $i$-th person got their hat back.

We showed that:

$$E[X_i] = P(X_i = 1) = P(i\text{th person gets their hat back}) = \frac{1}{n}$$

So,

$$E[X] = E\left[\sum_{i=1}^n X_i\right] = \sum_{i=1}^n E[X_i] = \sum_{i=1}^n \frac{1}{n} = n \cdot \frac{1}{n} = 1$$

Above was all review: now compute $\text{Var}(X)$.

Solution: Recall that each $X_i \sim \text{Ber}\left(\frac{1}{n}\right)$ ($1$ with probability $\frac{1}{n}$, and $0$ otherwise). (Remember these were

NOT independent RVs, but we still could apply linearity of expectation.) In our previous proof, we showed

that

$$\text{Var}(X) = \text{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i<j} \text{Cov}(X_i, X_j)$$

Recall that $X_i, X_j$ are indicator random variables taking values in $\{0, 1\}$, so their product $X_iX_j \in \{0, 1\}$ as well.

This allows us to calculate:

$$\begin{aligned}
E[X_iX_j] &= P(X_iX_j = 1) && \text{[since indicator, is just probability of being 1]} \\
&= P(X_i = 1, X_j = 1) && \text{[product is 1 if and only if both are 1]} \\
&= P(X_i = 1)P(X_j = 1 \mid X_i = 1) && \text{[chain rule]} \\
&= \frac{1}{n} \cdot \frac{1}{n-1}
\end{aligned}$$

This is because we need both person $i$ and person $j$ to get their hat back: person $i$ gets theirs back with probability $\frac{1}{n}$, and given this is true, person $j$ gets theirs back with probability $\frac{1}{n-1}$. So, by definition of covariance (recall each $E[X_i] = \frac{1}{n}$),

$$\begin{aligned}
\text{Cov}(X_i, X_j) &= E[X_iX_j] - E[X_i]E[X_j] \\
&= \frac{1}{n} \cdot \frac{1}{n-1} - \frac{1}{n} \cdot \frac{1}{n} && \text{[plug in]} \\
&= \frac{n}{n^2(n-1)} - \frac{n-1}{n^2(n-1)} && \text{[algebra]} \\
&= \frac{1}{n^2(n-1)} && \text{[algebra]}
\end{aligned}$$

Further, since $X_i$ is a Bernoulli (indicator) random variable:

$$\text{Var}(X_i) = p(1 - p) = \frac{1}{n}\left(1 - \frac{1}{n}\right)$$

Finally, we have

$$\begin{aligned}
\text{Var}(X) &= \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i<j} \text{Cov}(X_i, X_j) && \text{[plug in]} \\
&= n \cdot \frac{1}{n}\left(1 - \frac{1}{n}\right) + 2\binom{n}{2}\frac{1}{n^2(n-1)} && \text{[there are $\binom{n}{2}$ pairs with $i < j$]} \\
&= \left(1 - \frac{1}{n}\right) + 2 \cdot \frac{n(n-1)}{2} \cdot \frac{1}{n^2(n-1)} \\
&= \left(1 - \frac{1}{n}\right) + \frac{1}{n} = 1
\end{aligned}$$

How many pairs are there with $i < j$? This is just $\binom{n}{2} = \frac{n(n-1)}{2}$, since we just choose two different elements.
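Both answers ($E[X] = 1$ and $\text{Var}(X) = 1$, for any $n$) can be checked by simulation: shuffle the hats many times and measure the mean and variance of the number of people who get their own hat back.

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 10, 100_000

# X = number of fixed points of a uniformly random permutation
# (people whose own hat comes back to them).
counts = np.array([
    np.sum(rng.permutation(n) == np.arange(n)) for _ in range(trials)
])

print(counts.mean(), counts.var())  # both approximately 1
```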