Math 408, Actuarial Statistics I A.J. Hildebrand

Discrete Random Variables, I

Terminology

Informally, a random variable is a quantity X whose value depends on some random event. The space (or range) of X is the set S of possible values of X. If this set S is finite or countable (i.e., can be listed as a sequence x1, x2, ...), the random variable is called discrete.

General formulas

• Probability mass function (p.m.f.) (also called a "discrete density function" or, somewhat less precisely, a "discrete distribution"):
  - Definition and notation: f(x) = P(X = x) for all x ∈ S.
  - Properties: (1) f(x) ≥ 0; (2) Σ_{x∈S} f(x) = 1.
  - Uniform distribution on a set S: each of the values x ∈ S has the same probability, i.e., f(x) = 1/n for each value x, where n is the number of values.

• Expectation (mean):
  - Definition and notation: μ = E(X) = Σ_{x∈S} x f(x).
  - Properties: E(c) = c, E(cX) = c E(X), E(X + Y) = E(X) + E(Y).
  - Expectation of a function of X: E(u(X)) = Σ_{x∈S} u(x) f(x).

• Variance:
  - Definition and notation: σ² = Var(X) = E(X²) − E(X)².
  - Alternate definition: Var(X) = E((X − μ)²).
  - Properties: Var(c) = 0, Var(cX) = c² Var(X), Var(X + c) = Var(X).
  - Standard deviation: σ = √Var(X).

• Moment-generating function:
  - Definition and notation: M(t) = E(e^{tX}) = Σ_{x∈S} e^{tx} f(x).
  - Properties: the derivatives of M(t) at 0 generate the "moments" of X: M′(0) = E(X), M″(0) = E(X²), M‴(0) = E(X³), etc.
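As a concrete check of the formulas above, here is a short Python sketch (my illustration, not part of the original notes) that applies them to the uniform distribution on S = {1, ..., 6}, i.e. a fair die roll:

```python
from fractions import Fraction
from math import exp

# Illustrative sketch (not from the notes): the general formulas applied to
# the uniform distribution on S = {1, ..., 6}, i.e. a fair die roll.
S = range(1, 7)
f = {x: Fraction(1, 6) for x in S}            # p.m.f.: f(x) = 1/n

assert sum(f.values()) == 1                   # property (2): the f(x) sum to 1

mu = sum(x * f[x] for x in S)                 # E(X) = sum of x f(x)
EX2 = sum(x**2 * f[x] for x in S)             # E(X^2), via the E(u(X)) formula
var = EX2 - mu**2                             # Var(X) = E(X^2) - E(X)^2
var_alt = sum((x - mu)**2 * f[x] for x in S)  # alternate definition E((X - mu)^2)

# MGF: M(t) = sum of e^{tx} f(x); the property M'(0) = E(X) is checked
# numerically with a central difference.
M = lambda t: sum(exp(t * x) * float(f[x]) for x in S)
h = 1e-6
M_prime_0 = (M(h) - M(-h)) / (2 * h)

print(mu, var, var == var_alt)               # 7/2 35/12 True
print(abs(M_prime_0 - float(mu)) < 1e-6)     # True
```

Exact `Fraction` arithmetic keeps the two variance formulas comparable without floating-point noise; only the MGF check uses floats.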


Notes and tips

• Always specify the set of values x of a p.m.f.: A formula alone, such as f(x) = p(1 − p)^x, is useless without knowing the set of "legal" values x for this function. You can specify the values either by listing all of them explicitly (e.g., "x = 2, 3, 5, 7, 11") or with notational shortcuts (e.g., "x = 0, 1, ..., n − 1, n", or "x = 1, 2, 3, ...") if there is a clear pattern in these values. In the case of standard distributions (e.g., the geometric distribution given by the formula f(x) = p(1 − p)^{x−1}), make sure to memorize the exact range of values x along with the formula for f(x), and write it down whenever you use the formula. This helps avoid mistakes down the line, such as having a summation run from 1 to infinity instead of 0 to infinity, or vice versa.

• Distinguish between capital X, denoting an abstract random variable, and lower case x, denoting the values of such a random variable: In a formula like E(X) = Σ_{x∈S} x f(x), the X on the left refers to the abstract random variable X (an expectation is associated with a random variable), whereas the x on the right refers to the values of this random variable (the p.m.f. f(x) is a function of the values of a random variable). A notation like E(x) or f(X) is mathematically nonsensical, and you may get penalized in exams for using such notation!

• Expectation, variance, etc. are just ordinary nonrandom numbers, not functions: It is important to keep in mind that E(X), Var(X), etc. are just ordinary numbers associated with a random variable X. Taking the expectation of a random variable X completely "removes" the randomness from X and produces an ordinary number. Despite the function-like notation, E(X) and Var(X) are not functions in the usual (calculus) sense.

• Integral rule of thumb for expectations: An easy way to visualize the meaning of expectation, and to remember its properties, is with this rule of thumb: think of a random variable as a function on a unit interval (for example, the stock price charts that you see in the Wall Street Journal). Then its expectation represents the integral of this function. This rule of thumb is analogous to the "area rule of thumb" for probabilities.

• Notations such as P(X = 2) or P(X ≥ 2), where X is a random variable, are, strictly speaking, mathematically nonsensical (since one applies a probability function P(...) to an equation or inequality rather than a set), but they are commonly used and convenient shortcuts for what would otherwise require a very clumsy formulation. For example, "P(X = 2)" is shorthand for the following: "P(A), where A denotes the event that the random variable X is equal to 2." In the latter form the notation makes perfect sense, since the event A defined there corresponds to a subset of the sample space and thus has a well-defined associated probability P(A). An easy way to obtain the correct interpretation of such notations is to simply read the formula aloud, translating each symbol into words. For example, if X denotes the number of heads in a series of coin tosses, then "P(X ≥ 2)" translates into "the probability that the number of heads is greater than or equal to two." To compute such probabilities, simply add up the corresponding values of the p.m.f.; for example, P(X ∈ {1, 2, 3}) = f(1) + f(2) + f(3).
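The last tip can be made concrete with a short Python sketch (the example values are mine, not from the notes): take X to be the number of heads in n = 3 fair coin tosses, so X has the binomial p.m.f. f(x) = C(3, x)(1/2)^x (1/2)^(3−x) for x = 0, 1, 2, 3:

```python
from fractions import Fraction
from math import comb

# Hypothetical example (values are mine): X = number of heads in n = 3 fair
# coin tosses, with p.m.f. f(x) = C(3, x) (1/2)^x (1/2)^(3-x), x = 0, ..., 3.
n, p = 3, Fraction(1, 2)
f = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

# "P(X >= 2)" is shorthand for P(A), where A is the event that the number of
# heads is at least 2; compute it by adding the corresponding p.m.f. values.
P_at_least_2 = sum(f[x] for x in range(2, n + 1))
print(P_at_least_2)   # 1/2
```

Note that the summation range in the last line is exactly the "legal" values of x satisfying the inequality, which is why writing down the range of the p.m.f. matters.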