Given a continuous random variable X and its PDF fX(x), its expected value is defined as follows:

E[X] ≡ ∫−∞+∞ x fX(x) dx
Quite often, we use the shorthand notation μX = E[X]. Again, this is a straightforward extension of the discrete case, where E[X] ≡ ∑i xi pX(xi).
Example 7.2 As an illustration, let us consider the expected value of a uniform random variable on [a, b]. Symmetry suggests that the expected value should be the midpoint of the support. Indeed,

E[X] = ∫ab x · 1/(b − a) dx = (b2 − a2)/(2(b − a)) = (a + b)/2
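As a quick numerical sanity check (a sketch, not part of the text), we can approximate the integral defining E[X] for a uniform density with a simple midpoint rule and compare the result against the midpoint of the support; the endpoint values are arbitrary choices:

```python
# Numerical check of E[X] = (a + b)/2 for a uniform random variable
# on [a, b], via midpoint-rule integration of x * fX(x).
a, b = 2.0, 10.0

def f_uniform(x):
    # Uniform PDF on [a, b]: constant 1/(b - a) inside, 0 outside
    return 1.0 / (b - a) if a <= x <= b else 0.0

n = 100_000                 # number of subintervals
h = (b - a) / n             # subinterval width
expected = sum(
    (a + (i + 0.5) * h) * f_uniform(a + (i + 0.5) * h) * h
    for i in range(n)
)

print(expected)  # close to (a + b)/2 = 6.0
```

The midpoint rule is exact for linear integrands, so the result matches the midpoint up to floating-point error.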
By the same token, we define the variance of a continuous random variable as:

Var(X) ≡ E[(X − μX)2] = ∫−∞+∞ (x − μX)2 fX(x) dx
Common shorthand notations for variance are σ2 and σX2; its square root σX is the standard deviation. More generally, we define the expected value of a function g(X) of a random variable as

E[g(X)] ≡ ∫−∞+∞ g(x) fX(x) dx
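The definition of E[g(X)] can also be checked numerically. The sketch below (my own helper names, not from the text) integrates g(x) fX(x) with a midpoint rule for a uniform variable on [a, b], choosing g(x) = (x − μX)2 so that the result should match the known uniform variance (b − a)2/12:

```python
# Compute E[g(X)] = ∫ g(x) fX(x) dx numerically for X uniform on [a, b],
# with g(x) = (x - mu)^2, recovering Var(X) = (b - a)^2 / 12.
a, b = 0.0, 1.0
mu = (a + b) / 2            # E[X] for the uniform distribution

def pdf(x):
    # Uniform PDF on [a, b] (we only evaluate it inside the support)
    return 1.0 / (b - a)

def integrate(g, lo, hi, n=200_000):
    # Midpoint-rule approximation of the integral of g over [lo, hi]
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) * h for i in range(n))

variance = integrate(lambda x: (x - mu) ** 2 * pdf(x), a, b)
print(variance)  # close to 1/12 ≈ 0.08333
```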
The considerations we made about expected values of discrete random variables apply here as well. Since integration is a linear operator,4 just like the sum, expectation is linear in the continuous case, too. All of the properties of expectation and variance that we introduced for the discrete case carry over to the continuous case. In particular, we recall the following very useful properties:

E[αX + β] = αE[X] + β
Var(αX + β) = α2 Var(X)
where α and β are arbitrary real numbers.
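These two properties are easy to verify empirically. The following sketch (the parameter values and seed are arbitrary choices of mine) simulates a uniform variable on [0, 1], applies the affine transformation αX + β, and compares the sample mean and variance against α·E[X] + β and α2·Var(X):

```python
import random
import statistics

# Empirical check of E[aX + b] = a E[X] + b and Var(aX + b) = a^2 Var(X)
# for X uniform on [0, 1], where E[X] = 1/2 and Var(X) = 1/12.
random.seed(42)
alpha, beta = 3.0, -2.0

xs = [random.uniform(0.0, 1.0) for _ in range(200_000)]
ys = [alpha * x + beta for x in xs]

mean_y = statistics.fmean(ys)      # should be near alpha * 0.5 + beta = -0.5
var_y = statistics.pvariance(ys)   # should be near alpha**2 / 12 = 0.75

print(mean_y, var_y)
```

With 200,000 samples the estimates agree with the theoretical values to within a few thousandths.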