Category: Discrete Random Variables
-
Poisson distribution
The Poisson random variable arises naturally when we have to count the number of events occurring over a specific time interval. We will see that this kind of distribution is intimately related to exponential random variables, which are dealt with in Section 7.6.3, and to the Poisson stochastic process, introduced in Section 7.9. For now, the best way…
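As an illustrative sketch (not from the text), the Poisson PMF, P(X = k) = e^(−λ) λ^k / k!, can be evaluated in a few lines of Python; the rate λ = 3 below is a hypothetical choice:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """PMF of a Poisson random variable: P(X = k) = exp(-lam) * lam**k / k!"""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical example: on average lam = 3 events per time interval
for k in range(6):
    print(f"P(X = {k}) = {poisson_pmf(k, 3.0):.4f}")
```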
-
Binomial distribution
The binomial distribution arises as yet another variation on Bernoulli trials. We run n independent and identical experiments and let X be a random variable counting the number of successes. The support of the resulting random variable is {0, 1, 2,…, n}, and its probability distribution depends on two parameters: the probability of success p and the number of experiments n. Since events are independent, it…
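A minimal sketch of the binomial PMF, P(X = k) = C(n, k) p^k (1 − p)^(n − k); the values n = 10 and p = 0.3 are hypothetical:

```python
import math

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) = C(n, k) * p**k * (1 - p)**(n - k) for k in {0, 1, ..., n}."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical example: n = 10 trials with success probability p = 0.3
probs = [binomial_pmf(k, 10, 0.3) for k in range(11)]
print(sum(probs))  # the PMF sums to 1 over the support {0, 1, ..., 10}
```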
-
Geometric distribution
The geometric distribution is a generalization of the Bernoulli random variable. The underlying conceptual mechanism is the same, but the idea now is to repeat identical and independent Bernoulli trials until we get the first success. The number of experiments needed to stop the sequence is a random variable X, with unbounded support {1, 2, 3,…}. Finding its PMF…
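A sketch of both the PMF, P(X = k) = (1 − p)^(k − 1) p, and the trial-until-success mechanism itself; p = 0.5 is a hypothetical choice:

```python
import random

def geometric_pmf(k: int, p: float) -> float:
    """P(X = k) = (1 - p)**(k - 1) * p: first success occurs at trial k."""
    return (1 - p)**(k - 1) * p

def sample_geometric(p: float) -> int:
    """Repeat Bernoulli trials until the first success; return the trial count."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

print(geometric_pmf(3, 0.5))   # 0.125: two failures, then a success
print(sample_geometric(0.5))   # one random draw from the distribution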
-
Bernoulli distribution
The Bernoulli distribution is based on the idea of carrying out a random experiment, which may result in a success or a failure. Let p be the probability of success; then, 1 − p is the probability of failure. If we assign the value 1 to variable X in case of success, and 0 otherwise, we get the following PMF: P(X = 1) = p, P(X = 0) = 1 − p. It…
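A minimal sampler for this mechanism, assuming a hypothetical p = 0.4; the sample mean of many draws should approach p:

```python
import random

def sample_bernoulli(p: float) -> int:
    """Return 1 (success) with probability p, 0 (failure) otherwise."""
    return 1 if random.random() < p else 0

p = 0.4  # hypothetical success probability
draws = [sample_bernoulli(p) for _ in range(100_000)]
print(sum(draws) / len(draws))  # roughly 0.4
```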
-
Discrete uniform distribution
The uniform distribution is arguably the simplest model of uncertainty, as it assigns the same probability to each outcome: P(X = xᵢ) = 1/n, for i = 1,…, n. This makes sense only if there is a finite number n of possible values that the random variable can assume. If they are consecutive integers, we have an integer uniform distribution, which is characterized by the lower…
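A sketch of the integer uniform case, using hypothetical bounds a = 1 and b = 6 (a fair die):

```python
import random

# Integer uniform distribution on {a, a + 1, ..., b}: each of the
# n = b - a + 1 values has probability 1 / n.
a, b = 1, 6          # hypothetical bounds, e.g., a fair die
n = b - a + 1
print(1 / n)                   # common probability of each outcome
print(random.randint(a, b))    # one draw (randint includes both endpoints)
```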
-
Empirical distributions
Empirical distributions feature the closest link with descriptive statistics, since their PMF is typically estimated by collecting empirical relative frequencies. For instance, if we consider a sample of 10 observations of a random variable X, and X = 1 occurs in three cases, X = 2 in five cases, and X = 3 occurs twice, we may estimate P(X = 1) = 3/10 = 0.3, P(X = 2) = 5/10 = 0.5, and P(X = 3) = 2/10 = 0.2. Empirical distributions feature the largest…
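The same estimation, carried out on the sample described in the text:

```python
from collections import Counter

# The sample from the text: X = 1 three times, X = 2 five times, X = 3 twice
sample = [1, 1, 1, 2, 2, 2, 2, 2, 3, 3]

counts = Counter(sample)
pmf = {x: c / len(sample) for x, c in counts.items()}
print(pmf)  # {1: 0.3, 2: 0.5, 3: 0.2} -- the empirical relative frequencies
```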
-
A FEW USEFUL DISCRETE DISTRIBUTIONS
There is a wide family of discrete probability distributions, which we cannot cover exhaustively. Nevertheless, we may get acquainted with the essential ones, which will be illustrated by a few examples. First, we should draw the line between empirical and theoretical distributions. Since these terms may be a tad misleading, it is important to clarify…
-
Properties of variance
The first thing we should observe is that variance cannot be negative, as it is the expected value of a squared deviation. It is zero for a random variable that is not random at all, i.e., a constant. In doing calculations, the following identity is quite useful: Var(X) = E[X²] − (E[X])². This is the analog of Eqs. (4.5) and (4.6) in…
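A numeric check of the identity on a hypothetical three-point PMF, computing the variance both as the expected squared deviation and via the shortcut:

```python
# Check that Var(X) = E[X^2] - (E[X])^2 on a hypothetical PMF
values = [0, 1, 2]
probs  = [0.2, 0.5, 0.3]

mu  = sum(x * p for x, p in zip(values, probs))        # E[X] = 1.1
ex2 = sum(x**2 * p for x, p in zip(values, probs))     # E[X^2] = 1.7

var_definition = sum((x - mu)**2 * p for x, p in zip(values, probs))
var_shortcut   = ex2 - mu**2
print(var_definition, var_shortcut)  # both 0.49
```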
-
VARIANCE AND STANDARD DEVIATION
The expected value of a random variable tells us something about the location of its distribution, but we need a characterization of dispersion and risk as well. In descriptive statistics, we consider squared deviations with respect to the mean. Here we do basically the same thing, with respect to the expected value. DEFINITION 6.9 (Variance…
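As a sketch of the definition (not taken from the text), the variance of a hypothetical Bernoulli variable with p = 0.3 is the expected squared deviation from the mean, and the standard deviation is its square root:

```python
import math

# Variance as the expected squared deviation from the mean, for a
# hypothetical Bernoulli variable with p = 0.3 (support {0, 1})
p = 0.3
mu = 1 * p + 0 * (1 - p)                       # E[X] = p
var = (1 - mu)**2 * p + (0 - mu)**2 * (1 - p)  # E[(X - mu)^2] = p * (1 - p)
std = math.sqrt(var)                           # standard deviation
print(var, std)  # 0.21 and about 0.458
```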
-
Expected value of a function of a random variable
Typically, a random variable is just a risk factor that will affect some managerially more relevant outcome linked to cost or profit. This link may be represented by a function; hence, we are interested in functions of random variables. Given a random variable X and a function, like g(x) = x², or g(x) = max{x, 0}, we define a new…
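A sketch of computing E[g(X)] = Σᵢ g(xᵢ) P(X = xᵢ) for the function g(x) = max{x, 0} mentioned above, on a hypothetical PMF that includes a negative value:

```python
# E[g(X)] = sum over the support of g(x_i) * P(X = x_i), here with
# g(x) = max(x, 0) on a hypothetical PMF
values = [-1, 0, 2]
probs  = [0.3, 0.4, 0.3]

g = lambda x: max(x, 0)
expected_g = sum(g(x) * p for x, p in zip(values, probs))
print(expected_g)  # 0.6: the negative outcome contributes nothing
```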