Math 431 – Spring 2014

Homework 9

Due: April 17th (Sections 2 and 4) or 18th (Sections 1 and 3), 2014, depending upon your section (according to the instructions of your lecturer). Please read the instructions/suggestions on the course webpage.

Hand in the following problems:

1. Let $E[X] = 1$, $E[X^2] = 3$, $E[XY] = -4$, and $E[Y] = 2$. Find $\mathrm{Cov}(X, 2X+Y)$.

Solution: By the definition of covariance and linearity of expectation,
\[
\mathrm{Cov}(X, 2X+Y) = E[X(2X+Y)] - E[X]\,E[2X+Y]
= 2E[X^2] + E[XY] - E[X]\bigl(2E[X] + E[Y]\bigr)
= 2(3) + (-4) - 1\bigl(2(1) + 2\bigr) = -2.
\]
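As a quick numeric sanity check of this arithmetic, the given moments can be plugged into the formula directly (a minimal sketch; the variable names are mine):

    # Plug the given moments into Cov(X, 2X+Y) = 2 E[X^2] + E[XY] - E[X](2 E[X] + E[Y]).
    EX, EX2, EXY, EY = 1, 3, -4, 2
    cov = 2 * EX2 + EXY - EX * (2 * EX + EY)
    print(cov)  # -2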

2. Suppose you roll a fair 20-sided die 5 times. Let $X$ denote the number of different outcomes you see. (For example, $(20, 17, 18, 17, 3)$ would give $X = 4$.)

(a) Find the mean and variance of $X$.

Solution: We compute the mean and variance much like the die problem from Homework 8. Let $X_i$ denote the indicator of the event that the number $i$ is seen on at least one of the five die rolls. Then, by linearity of expectation and exchangeability,
\[
E[X] = E[X_1] + \cdots + E[X_{20}] = 20\,E[X_1]
= 20\,P(\text{number 1 is seen at least once in the five rolls})
= 20\bigl(1 - P(\text{number 1 is not seen in the five rolls})\bigr)
= 20\left(1 - \left(\tfrac{19}{20}\right)^{5}\right).
\]
To find the variance, we use Fact 7.22:
\[
\mathrm{Var}(X) = \mathrm{Var}\!\left(\sum_{i=1}^{20} X_i\right)
= \sum_{i=1}^{20} \mathrm{Var}(X_i) + 2\sum_{i<j} \mathrm{Cov}(X_i, X_j).
\]
By exchangeability,
\[
\mathrm{Var}(X) = 20\,\mathrm{Var}(X_1) + 20\cdot 19\,\mathrm{Cov}(X_1, X_2).
\]
For the indicator function,
\[
\mathrm{Var}(X_1) = p(1-p) = \left(1 - \left(\tfrac{19}{20}\right)^{5}\right)\left(\tfrac{19}{20}\right)^{5}.
\]
For the covariance, note that $X_1 X_2$ is the indicator of the event that both 1 and 2 are seen, so
\[
\mathrm{Cov}(X_1, X_2) = E[X_1 X_2] - E[X_1]E[X_2]
= P(\text{1 and 2 are each seen at least once in the 5 rolls}) - P(\text{1 is seen at least once in the 5 rolls})^{2}.
\]
By inclusion–exclusion, $P(\text{1 or 2 is not seen}) = 2\left(\tfrac{19}{20}\right)^{5} - \left(\tfrac{18}{20}\right)^{5}$, so
\[
\mathrm{Cov}(X_1, X_2)
= \left(1 - 2\left(\tfrac{19}{20}\right)^{5} + \left(\tfrac{18}{20}\right)^{5}\right)
- \left(1 - \left(\tfrac{19}{20}\right)^{5}\right)^{2}.
\]
Putting it all together we have
\[
\mathrm{Var}(X) = 20\left(1 - \left(\tfrac{19}{20}\right)^{5}\right)\left(\tfrac{19}{20}\right)^{5}
+ 20\cdot 19\left[\left(1 - 2\left(\tfrac{19}{20}\right)^{5} + \left(\tfrac{18}{20}\right)^{5}\right)
- \left(1 - \left(\tfrac{19}{20}\right)^{5}\right)^{2}\right].
\]

(b) What are the mean and variance if you roll the die $n$ times?

Solution: In both the mean and the variance, the only quantities that depend on the number of rolls are the fives in the exponents. Thus
\[
E[X] = 20\,E[X_1] = 20\left(1 - \left(\tfrac{19}{20}\right)^{n}\right) \to 20 \quad \text{as } n \to \infty.
\]
We would expect to see all 20 numbers show up if we roll many, many times, and we would expect the variance to tend to zero. We get
\[
\mathrm{Var}(X) = 20\left(1 - \left(\tfrac{19}{20}\right)^{n}\right)\left(\tfrac{19}{20}\right)^{n}
+ 20\cdot 19\left[\left(1 - 2\left(\tfrac{19}{20}\right)^{n} + \left(\tfrac{18}{20}\right)^{n}\right)
- \left(1 - \left(\tfrac{19}{20}\right)^{n}\right)^{2}\right].
\]
As $n \to \infty$ we get
\[
\mathrm{Var}(X) \to 20\,(1)(0) + 20\cdot 19\,(1 - 1) = 0.
\]
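To make the closed-form answers concrete, here is a small Monte Carlo sketch comparing simulated values of $E[X]$ and $\mathrm{Var}(X)$ for five rolls with the formulas above; the function names, trial count, and the 20-sided default are my own choices, not part of the assignment.

    import random

    def distinct_count_sim(n_rolls, trials=200_000, sides=20):
        """Simulate the number of distinct faces seen in n_rolls of a fair die."""
        counts = [len({random.randint(1, sides) for _ in range(n_rolls)}) for _ in range(trials)]
        mean = sum(counts) / trials
        var = sum((c - mean) ** 2 for c in counts) / trials
        return mean, var

    def distinct_count_exact(n_rolls, sides=20):
        """Closed-form mean and variance from the indicator-variable argument above."""
        p_miss = ((sides - 1) / sides) ** n_rolls                    # P(a fixed face never shows)
        p_seen = 1 - p_miss                                          # P(a fixed face shows at least once)
        p_both = 1 - 2 * p_miss + ((sides - 2) / sides) ** n_rolls   # P(two fixed faces both show)
        mean = sides * p_seen
        var = sides * p_seen * p_miss + sides * (sides - 1) * (p_both - p_seen ** 2)
        return mean, var

    print(distinct_count_exact(5))   # (4.524..., 0.367...)
    print(distinct_count_sim(5))     # close to the exact values

Calling distinct_count_exact with a large n_rolls shows the mean approaching 20 and the variance approaching 0, matching part (b).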

3. Let $Z_1, Z_2, \ldots, Z_n$ be independent normal random variables with mean 0 and variance 1, and let
\[
W = Z_1^2 + \cdots + Z_n^2.
\]
(a) Using the fact that $W$ is a sum of independent random variables, compute the mean and variance of $W$.

Solution: For the mean we use linearity of expectation and the fact that the $Z_i$ are identically distributed. Since $E[Z_i] = 0$, we have $E[Z_i^2] = E[Z_i^2] - E[Z_i]^2 = \mathrm{Var}(Z_i) = 1$. Therefore,
\[
E[W] = \sum_{i=1}^{n} E[Z_i^2] = n\,E[Z_1^2] = n.
\]
For the variance, by independence,
\[
\mathrm{Var}(W) = \sum_{i=1}^{n} \mathrm{Var}(Z_i^2) = n\,\mathrm{Var}(Z_1^2).
\]
The variance is defined as
\[
\mathrm{Var}(Z_1^2) = E[Z_1^4] - E[Z_1^2]^2.
\]
We computed the fourth moment of a standard normal random variable in Homework 4 (problem 4): $E[Z_1^4] = 3$. Thus
\[
\mathrm{Var}(W) = n\,\mathrm{Var}(Z_1^2) = n(3 - 1) = 2n.
\]
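A short simulation sketch (the choice $n = 6$ and the sample size are arbitrary) corroborates $E[W] = n$ and $\mathrm{Var}(W) = 2n$:

    import random
    import statistics

    def chi_square_sample(n, trials=200_000):
        """Sample W = Z_1^2 + ... + Z_n^2 for independent standard normal Z_i."""
        return [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) for _ in range(trials)]

    n = 6
    samples = chi_square_sample(n)
    print(statistics.fmean(samples))      # close to n = 6
    print(statistics.pvariance(samples))  # close to 2n = 12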

(b) Find the moment generating function of $W$ and use it to compute the mean and variance of $W$.

Solution: The moment generating function of $W$ is defined to be
\[
E[e^{tW}] = E\bigl[e^{t(Z_1^2 + Z_2^2 + \cdots + Z_n^2)}\bigr].
\]
By independence of the $Z_i$ we use Fact 7.13 to write the right-hand side as a product of moment generating functions. Since the $Z_i$ are identically distributed, it is a product of copies of the same moment generating function. That is,
\[
M_W(t) = E[e^{tW}] = E\bigl[e^{tZ_1^2}\bigr]^n = M_{Z_1^2}(t)^n.
\]
We now compute the moment generating function of $Z_1^2$ using the density of the standard normal distribution. We have
\[
E\bigl[e^{tZ_1^2}\bigr] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tz^2} e^{-z^2/2}\,dz
= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(1-2t)z^2/2}\,dz.
\]
This integral converges only for $t < 1/2$. Using the substitution $\tilde z = \sqrt{1-2t}\,z$ we have
\[
E\bigl[e^{tZ_1^2}\bigr] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(1-2t)z^2/2}\,dz
= \frac{1}{\sqrt{1-2t}}\cdot\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\tilde z^2/2}\,d\tilde z
= \frac{1}{\sqrt{1-2t}} \quad \text{for } t < 1/2.
\]
Therefore,
\[
M_W(t) = \begin{cases} (1-2t)^{-n/2} & \text{for } t < 1/2 \\ \infty & \text{for } t \ge 1/2. \end{cases}
\]
Using the moment generating function we calculate the mean to be
\[
E[W] = M_W'(0) = n.
\]
For the variance, we calculate the second moment,
\[
E[W^2] = M_W''(0) = n(n+2) = n^2 + 2n.
\]
The variance is
\[
\mathrm{Var}(W) = E[W^2] - E[W]^2 = n^2 + 2n - n^2 = 2n.
\]
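As a numerical cross-check of the moment calculations (the helper names and step sizes are illustrative), the first two derivatives of $M_W(t) = (1-2t)^{-n/2}$ at $t = 0$ can be approximated by finite differences and compared with $n$ and $n(n+2)$:

    def mgf(t, n):
        """MGF of the sum of n squared standard normals, valid for t < 1/2."""
        return (1.0 - 2.0 * t) ** (-n / 2.0)

    def first_diff(f, x, h=1e-5):
        return (f(x + h) - f(x - h)) / (2 * h)

    def second_diff(f, x, h=1e-4):
        return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

    n = 6
    m1 = first_diff(lambda t: mgf(t, n), 0.0)    # ~ n = 6
    m2 = second_diff(lambda t: mgf(t, n), 0.0)   # ~ n(n+2) = 48
    print(m1, m2, m2 - m1 ** 2)                  # variance ~ 2n = 12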

4. Let $(X, Y)$ be a uniformly distributed random point on the quadrilateral $D$ with vertices $(0,0)$, $(2,0)$, $(1,1)$, and $(0,1)$. Calculate the covariance of $X$ and $Y$.

Solution: To calculate the covariance we need to calculate $E[XY]$, $E[X]$, and $E[Y]$. First, since the area of the quadrilateral is $3/2$, the joint density of $(X, Y)$ is
\[
p_{X,Y}(x, y) = \begin{cases} \tfrac{2}{3}, & (x,y) \in D \\ 0, & (x,y) \notin D. \end{cases}
\]
For $0 \le y \le 1$ the region $D$ runs from $x = 0$ to $x = 2 - y$. Then, integrating,
\[
E[XY] = \int_0^1 \int_0^{2-y} \tfrac{2}{3}\,xy\,dx\,dy
= \int_0^1 \tfrac{2}{6}\,y(2-y)^2\,dy
= \tfrac{2}{6}\left[\tfrac{4}{2}y^2 - \tfrac{4}{3}y^3 + \tfrac{1}{4}y^4\right]_0^1
= \tfrac{2}{6}\cdot\tfrac{11}{12} = \tfrac{11}{36},
\]
\[
E[X] = \int_0^1 \int_0^{2-y} \tfrac{2}{3}\,x\,dx\,dy
= \int_0^1 \tfrac{2}{6}\,(2-y)^2\,dy
= \tfrac{2}{6}\left[4y - \tfrac{4}{2}y^2 + \tfrac{1}{3}y^3\right]_0^1
= \tfrac{2}{6}\cdot\tfrac{7}{3} = \tfrac{7}{9},
\]
\[
E[Y] = \int_0^1 \int_0^{2-y} \tfrac{2}{3}\,y\,dx\,dy
= \int_0^1 \tfrac{2}{3}\,(2-y)y\,dy
= \tfrac{2}{3}\left[\tfrac{2}{2}y^2 - \tfrac{1}{3}y^3\right]_0^1
= \tfrac{2}{3}\cdot\tfrac{2}{3} = \tfrac{4}{9}.
\]
By the definition of covariance, we have
\[
\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \tfrac{11}{36} - \tfrac{7}{9}\cdot\tfrac{4}{9}
= \tfrac{11}{36} - \tfrac{28}{81} = -\tfrac{13}{324}.
\]
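These integrals can be spot-checked by Monte Carlo. The sketch below (sample size arbitrary) rejection-samples uniform points from the bounding box $[0,2] \times [0,1]$, keeping those with $x \le 2 - y$:

    import random

    def sample_point_in_D():
        """Rejection-sample a uniform point from D: 0 <= y <= 1, 0 <= x <= 2 - y."""
        while True:
            x, y = random.uniform(0, 2), random.uniform(0, 1)
            if x <= 2 - y:
                return x, y

    trials = 200_000
    pts = [sample_point_in_D() for _ in range(trials)]
    ex = sum(x for x, _ in pts) / trials
    ey = sum(y for _, y in pts) / trials
    exy = sum(x * y for x, y in pts) / trials
    print(exy - ex * ey)   # close to -13/324 ~ -0.0401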

5. Let the joint pmf of $(X, Y)$ be given by the table below.

            Y = 0    Y = 1    Y = 2    Y = 3
    X = 1    1/15     1/15     2/15     1/15
    X = 2    1/10     1/10     1/5      1/10
    X = 3    1/30     1/30     0        1/10

(a) Find $\mathrm{Cov}(X, Y)$.

Solution: To find the covariance, we compute $E[XY]$, $E[X]$, and $E[Y]$. We have
\[
E[XY] = \sum_{i=1}^{3}\sum_{j=0}^{3} ij\,p(i, j)
= 0\left(\tfrac{1}{15} + \tfrac{1}{10} + \tfrac{1}{30}\right)
+ 1\cdot\tfrac{1}{15} + 2\cdot\tfrac{2}{15} + 3\cdot\tfrac{1}{15}
+ 2\cdot\tfrac{1}{10} + 4\cdot\tfrac{1}{5} + 6\cdot\tfrac{1}{10}
+ 3\cdot\tfrac{1}{30} + 9\cdot\tfrac{1}{10}
= \tfrac{47}{15} \approx 3.133.
\]
For $E[X]$ and $E[Y]$ we can use the marginal pmfs, obtained by summing the rows and columns of the table:
\[
p_X(1) = \tfrac{1}{3}, \quad p_X(2) = \tfrac{1}{2}, \quad p_X(3) = \tfrac{1}{6};
\qquad
p_Y(0) = \tfrac{1}{5}, \quad p_Y(1) = \tfrac{1}{5}, \quad p_Y(2) = \tfrac{1}{3}, \quad p_Y(3) = \tfrac{4}{15}.
\]
Then
\[
E[X] = \sum_{i=1}^{3} i\,p_X(i) = 1\cdot\tfrac{1}{3} + 2\cdot\tfrac{1}{2} + 3\cdot\tfrac{1}{6} = \tfrac{11}{6} \approx 1.833,
\]
\[
E[Y] = \sum_{j=0}^{3} j\,p_Y(j) = 0\cdot\tfrac{1}{5} + 1\cdot\tfrac{1}{5} + 2\cdot\tfrac{1}{3} + 3\cdot\tfrac{4}{15} = \tfrac{5}{3} \approx 1.667.
\]
Using the above calculations, the covariance is
\[
\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \tfrac{47}{15} - \tfrac{11}{6}\cdot\tfrac{5}{3} = \tfrac{7}{90} \approx 0.078.
\]
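The table arithmetic is easy to verify exactly in code; the dictionary layout below is my own, but the probabilities are copied from the table, and everything stays in exact fractions:

    from fractions import Fraction as F

    # Joint pmf p[(x, y)] copied from the table above.
    p = {
        (1, 0): F(1, 15), (1, 1): F(1, 15), (1, 2): F(2, 15), (1, 3): F(1, 15),
        (2, 0): F(1, 10), (2, 1): F(1, 10), (2, 2): F(1, 5),  (2, 3): F(1, 10),
        (3, 0): F(1, 30), (3, 1): F(1, 30), (3, 2): F(0),     (3, 3): F(1, 10),
    }
    exy = sum(x * y * pr for (x, y), pr in p.items())
    ex = sum(x * pr for (x, _), pr in p.items())
    ey = sum(y * pr for (_, y), pr in p.items())
    print(exy, ex, ey, exy - ex * ey)   # 47/15 11/6 5/3 7/90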

(b) Find $\mathrm{Corr}(X, Y)$.

Solution: To find the correlation coefficient, we need the covariance from above as well as the variances of $X$ and $Y$. To find the variances, we need the second moments:
\[
E[X^2] = \sum_{i=1}^{3} i^2\,p_X(i) = 1\cdot\tfrac{1}{3} + 4\cdot\tfrac{1}{2} + 9\cdot\tfrac{1}{6} = \tfrac{23}{6},
\]
\[
E[Y^2] = \sum_{j=0}^{3} j^2\,p_Y(j) = 0\cdot\tfrac{1}{5} + 1\cdot\tfrac{1}{5} + 4\cdot\tfrac{1}{3} + 9\cdot\tfrac{4}{15} = \tfrac{59}{15}.
\]
Hence $\mathrm{Var}(X) = \tfrac{23}{6} - \left(\tfrac{11}{6}\right)^2 = \tfrac{17}{36}$ and $\mathrm{Var}(Y) = \tfrac{59}{15} - \left(\tfrac{5}{3}\right)^2 = \tfrac{52}{45}$. The correlation coefficient is defined as
\[
\mathrm{Corr}(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)}\sqrt{\mathrm{Var}(Y)}}
= \frac{7/90}{\sqrt{17/36}\,\sqrt{52/45}} \approx 0.105.
\]
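A self-contained numeric check of the correlation (the marginals are hard-coded from the table, and the covariance $7/90$ is carried over from part (a)):

    from fractions import Fraction as F
    from math import sqrt

    # Marginal pmfs read off the table (rows give X, columns give Y).
    p_x = {1: F(1, 3), 2: F(1, 2), 3: F(1, 6)}
    p_y = {0: F(1, 5), 1: F(1, 5), 2: F(1, 3), 3: F(4, 15)}

    ex = sum(x * pr for x, pr in p_x.items())
    ey = sum(y * pr for y, pr in p_y.items())
    var_x = sum(x * x * pr for x, pr in p_x.items()) - ex ** 2
    var_y = sum(y * y * pr for y, pr in p_y.items()) - ey ** 2
    cov = F(7, 90)  # from part (a)
    print(var_x, var_y)                        # 17/36 52/45
    print(cov / (sqrt(var_x) * sqrt(var_y)))   # about 0.105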

6. Let $X$ be uniformly distributed on $[-a, a]$ for $a > 0$ and let $Y = X^2$. Show that $X$ and $Y$ are uncorrelated, even though $Y$ is a function of $X$.

Solution: $X$ is uniformly distributed on $[-a, a]$ and therefore has probability density function
\[
p_X(x) = \begin{cases} \tfrac{1}{2a}, & x \in [-a, a] \\ 0, & x \notin [-a, a]. \end{cases}
\]
To show that $X$ and $Y$ are uncorrelated, we must show that $\mathrm{Cov}(X, Y) = 0$, that is,
\[
\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = E[X^3] - E[X]E[X^2] = 0.
\]
We compute the third moment of $X$ using the density function,
\[
E[X^3] = \int_{-\infty}^{\infty} x^3 p_X(x)\,dx = \int_{-a}^{a} \frac{x^3}{2a}\,dx = \frac{a^4 - (-a)^4}{8a} = 0.
\]
Because the density $\tfrac{1}{2a}$ is constant in $x$, and therefore symmetric about $x = 0$, every odd moment of $X$ is zero. That is,
\[
E[X] = E[X^3] = \cdots = E[X^{2n+1}] = 0
\]
for every $n = 0, 1, 2, 3, \ldots$. Thus
\[
\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = E[X^3] - E[X]E[X^2] = 0 - 0\cdot E[X^2] = 0.
\]
Therefore, $X$ and $Y$ are uncorrelated.
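A quick Monte Carlo illustration (the choice $a = 3$ and the sample size are arbitrary; statistics.correlation requires Python 3.10 or later): the sample covariance and correlation of $X$ and $Y = X^2$ hover near zero.

    import random
    import statistics

    a = 3.0
    xs = [random.uniform(-a, a) for _ in range(200_000)]
    ys = [x ** 2 for x in xs]

    print(statistics.covariance(xs, ys))    # close to 0 (sampling noise only)
    print(statistics.correlation(xs, ys))   # close to 0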

7. Let $I_A$ be the indicator of the event $A$. Show that for any $A$, $B$ we have
\[
\mathrm{Corr}(I_A, I_B) = \mathrm{Corr}(I_{A^c}, I_{B^c}).
\]
Solution: We start with the left-hand side of the above equation and the definition of covariance,
\[
\mathrm{Cov}(I_A, I_B) = E[I_A I_B] - E[I_A]E[I_B] = P(A \cap B) - P(A)P(B).
\]
Next we use the identity $1 - P(A^c \cap B^c) = P(A \cup B) = P(A) + P(B) - P(A \cap B)$. This yields
\[
\mathrm{Cov}(I_A, I_B) = P(A) + P(B) - \bigl(1 - P(A^c \cap B^c)\bigr) - P(A)P(B).
\]
Next we substitute $P(A) = 1 - P(A^c)$ and $P(B) = 1 - P(B^c)$ to get
\[
\mathrm{Cov}(I_A, I_B) = P(A) + P(B) - 1 + P(A^c \cap B^c) - \bigl(1 - P(A^c)\bigr)\bigl(1 - P(B^c)\bigr).
\]
Simplifying yields
\[
\mathrm{Cov}(I_A, I_B) = P(A^c \cap B^c) - P(A^c)P(B^c) = E[I_{A^c} I_{B^c}] - E[I_{A^c}]E[I_{B^c}] = \mathrm{Cov}(I_{A^c}, I_{B^c}).
\]
To finish the problem, we must show that the variances are equal. For any indicator we have
\[
\mathrm{Var}(I_A) = E[I_A^2] - E[I_A]^2 = P(A) - P(A)^2 = P(A)\bigl(1 - P(A)\bigr) = P(A)P(A^c).
\]
Thus
\[
\mathrm{Var}(I_A) = \mathrm{Var}(I_{A^c}),
\]
and therefore
\[
\mathrm{Corr}(I_A, I_B) = \mathrm{Corr}(I_{A^c}, I_{B^c}).
\]
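The identity can be sanity-checked on a tiny finite probability space; the outcomes, weights, and events below are arbitrary choices, not part of the problem:

    from fractions import Fraction as F

    # A toy finite probability space: outcomes 0..5, equally likely; A and B are arbitrary events.
    omega = set(range(6))
    prob = {w: F(1, 6) for w in omega}
    A = {0, 1, 2}
    B = {1, 2, 3}

    def corr_of_indicators(E, G):
        """Correlation of the indicator variables of the events E and G."""
        pE = sum(prob[w] for w in E)
        pG = sum(prob[w] for w in G)
        pEG = sum(prob[w] for w in E & G)
        cov = pEG - pE * pG
        return float(cov) / float(pE * (1 - pE) * pG * (1 - pG)) ** 0.5

    print(corr_of_indicators(A, B))                  # 1/3
    print(corr_of_indicators(omega - A, omega - B))  # same value

Swapping in other events A and B (or other outcome weights) leaves the two printed values equal, exactly as the algebra above predicts.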

8. Suppose that for the random variables $X$, $Y$ we have $E[X] = 2$, $E[Y] = 1$, $E[X^2] = 5$,
quotesdbs_dbs13.pdfusesText_19