Sahithyan's S2 — Methods of Mathematics

Discrete Probability Distribution

Probability Mass Function

Denoted by P. Gives the probability that a discrete random variable X is exactly equal to some value x.

\sum_{x} P(x) = 1

Parameters

Mean

E(X) = \mu = \sum_{i} x_i \cdot P(X = x_i)

Here:

  • x_i represents each possible value of X
  • P(X = x_i) is the probability of observing that value

Variance

\text{Var}(X) = \sigma^2 = \sum_{i} (x_i - \mu)^2 \cdot P(X = x_i)

An equivalent computational formula is:

\text{Var}(X) = \sum_{i} x_i^2 \cdot P(X = x_i) - \mu^2

Cumulative Distribution Function

F(x) = \sum_{t \leq x} P(X = t)

F(x) = F(\lfloor x \rfloor) \quad \text{for } x \notin \mathbb{Z}
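The two identities above can be sketched directly from a PMF table. The PMF values here are hypothetical, chosen only for illustration; for an integer-valued X, F at a non-integer point equals F at the floor of that point.

```python
from math import floor

# Hypothetical PMF for illustration: support point -> probability
pmf = {1: 0.2, 2: 0.3, 3: 0.4, 4: 0.1}

def cdf(x):
    """F(x) = sum of P(X = t) over all support points t <= x."""
    return sum(p for t, p in pmf.items() if t <= x)

# F(2.7) = F(floor(2.7)) = F(2), since the support is integer-valued
print(cdf(2.7), cdf(floor(2.7)))  # both 0.5
```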

Example

Consider a discrete random variable X with the following probability distribution:

X    P(X)
1    0.2
2    0.3
3    0.4
4    0.1

The mean would be: E(X) = 1 \cdot 0.2 + 2 \cdot 0.3 + 3 \cdot 0.4 + 4 \cdot 0.1 = 2.4.

The variance would be: \text{Var}(X) = (1-2.4)^2 \cdot 0.2 + (2-2.4)^2 \cdot 0.3 + (3-2.4)^2 \cdot 0.4 + (4-2.4)^2 \cdot 0.1 = 0.84.
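The worked example can be checked numerically; this short sketch computes the mean and both forms of the variance from the table above, confirming they agree.

```python
# PMF of X from the table above
values = [1, 2, 3, 4]
probs = [0.2, 0.3, 0.4, 0.1]

mean = sum(x * p for x, p in zip(values, probs))                  # E(X)
var = sum((x - mean) ** 2 * p for x, p in zip(values, probs))     # definition form
var_alt = sum(x**2 * p for x, p in zip(values, probs)) - mean**2  # computational form

print(mean, var, var_alt)  # 2.4, 0.84, 0.84 (up to floating-point rounding)
```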

Types

Uniform distribution

A type of discrete probability distribution where all outcomes are equally likely. If a random variable X can take n distinct values, each value has a probability of P(X = x) = \frac{1}{n}.
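A minimal sketch, taking a fair six-sided die (n = 6) as an assumed example: every support point gets probability 1/n, and exact fractions avoid rounding.

```python
from fractions import Fraction

# Discrete uniform on n = 6 outcomes (e.g. a fair die): P(X = x) = 1/n
n = 6
pmf = {x: Fraction(1, n) for x in range(1, n + 1)}

assert sum(pmf.values()) == 1              # probabilities sum to 1
mean = sum(x * p for x, p in pmf.items())  # E(X) for values 1..n is (n + 1)/2
print(mean)  # 7/2
```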

Bernoulli distribution

A type of discrete probability distribution where there are only two possible outcomes, often referred to as "success" and "failure". If a random variable X can take values 0 or 1, with P(X = 1) = p and P(X = 0) = 1 - p, then X follows a Bernoulli distribution.

PMF of Bernoulli distribution:

P(x) = p^x (1-p)^{1-x} \quad ; \quad x = 0 \text{ or } 1

Here:

  • p is the probability of success
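The Bernoulli PMF above is a one-liner; the value p = 0.3 below is an assumed example. Note how the exponent trick p^x (1-p)^(1-x) selects p when x = 1 and 1-p when x = 0.

```python
def bernoulli_pmf(x, p):
    """P(x) = p**x * (1 - p)**(1 - x) for x in {0, 1}."""
    if x not in (0, 1):
        raise ValueError("Bernoulli support is {0, 1}")
    return p**x * (1 - p) ** (1 - x)

p = 0.3  # assumed success probability for illustration
print(bernoulli_pmf(1, p), bernoulli_pmf(0, p))  # 0.3 0.7
```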

Binomial distribution

When an experiment consists of n independent repeated Bernoulli trials, the experiment is said to be a binomial experiment. The number of successes X follows a binomial distribution, with PMF:

P(x) = \binom{n}{x} p^x (1-p)^{n-x} \quad ; \quad x = 0, 1, \ldots, n

Here:

  • n is the number of trials.
  • p is the probability of success.

Assumptions made in binomial distribution:

  • Each trial is independent.
  • Each trial has only two possible outcomes.
  • The probability of success is constant across trials.
  • The number of trials is fixed.

For a binomial distribution, the mean and standard deviation are:

\mu = np \qquad \sigma = \sqrt{np(1-p)}
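The binomial PMF and its moments can be sketched with the standard library; n = 10 and p = 0.5 are assumed example parameters.

```python
from math import comb, sqrt

def binomial_pmf(x, n, p):
    """P(x) = C(n, x) * p**x * (1 - p)**(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.5  # assumed parameters for illustration

# The PMF sums to 1 over x = 0..n
assert abs(sum(binomial_pmf(x, n, p) for x in range(n + 1)) - 1) < 1e-12

mean = n * p                # mu = np
sd = sqrt(n * p * (1 - p))  # sigma = sqrt(np(1-p))
print(mean, sd)  # 5.0 and roughly 1.58
```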

Poisson distribution

If X is the number of independent successes occurring within a fixed time interval, then X is said to follow a Poisson distribution, with PMF:

P(x) = \frac{e^{-\lambda} \lambda^x}{x!} \quad ; \quad x = 0, 1, 2, \ldots

Here:

  • \lambda - average number of successes occurring within the fixed time interval

\text{E}(X) = \lambda = \text{Var}(X)
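A minimal sketch of the Poisson PMF, with λ = 2.0 as an assumed rate. Summing over a truncated support checks that the probabilities total 1 and that the mean equals λ, as stated above.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(x) = e**(-lam) * lam**x / x!."""
    return exp(-lam) * lam**x / factorial(x)

lam = 2.0  # assumed average rate for illustration

# Truncating at x = 100 captures essentially all the mass for small lam
total = sum(poisson_pmf(x, lam) for x in range(100))
mean = sum(x * poisson_pmf(x, lam) for x in range(100))
print(total, mean)  # approximately 1.0 and 2.0
```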

If X_i \sim \text{Poisson}(\lambda_i) for i = 1, 2, \ldots, n, and the X_i are independent, then:

\sum_{i=1}^n X_i \sim \text{Poisson}\bigg( \sum_{i=1}^n \lambda_i \bigg)
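The additivity property above can be verified numerically for n = 2: the PMF of X_1 + X_2 is the convolution of the two individual PMFs, and it matches Poisson(λ_1 + λ_2) pointwise. The rates 1.5 and 2.5 are assumed example values.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

lam1, lam2 = 1.5, 2.5  # assumed rates for illustration

def conv_pmf(k):
    """P(X1 + X2 = k) by convolving the two independent PMFs."""
    return sum(poisson_pmf(j, lam1) * poisson_pmf(k - j, lam2)
               for j in range(k + 1))

# The convolution matches Poisson(lam1 + lam2) at every point checked
for k in range(10):
    assert abs(conv_pmf(k) - poisson_pmf(k, lam1 + lam2)) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```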