Joint Distributions

A joint distribution specifies the probability of observing a particular combination of values for two or more random variables, and characterizes the relationship between them, including their dependencies and correlations.

Definition

For two random variables $X$ and $Y$, the joint probability distribution gives the probability that $X$ and $Y$ simultaneously take on specific values. This can be expressed as:

  • For discrete random variables: $P(X = x, Y = y)$ or $p_{X,Y}(x, y)$
  • For continuous random variables: $f_{X,Y}(x, y)$

For a joint probability distribution $P(X = x, Y = y)$, $P(X = x)$ and $P(Y = y)$ are the marginal probabilities.
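
As a concrete illustration (not from the course notes), a discrete joint pmf can be stored as a table keyed by value pairs. The sketch below assumes a made-up 2×2 table named `joint_pmf`; the marginals are computed from it in the sections that follow.

```python
# Hypothetical joint pmf P(X = x, Y = y) for two binary random variables.
# The numbers are illustrative only: they are non-negative and sum to 1.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

# The probability of a specific combination is read directly from the table:
print(joint_pmf[(0, 1)])  # P(X = 0, Y = 1) = 0.30
```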

Properties

Non-negativity

  • Discrete case: $P(X = x, Y = y) \ge 0$ for all $x, y$
  • Continuous case: $f_{X,Y}(x, y) \ge 0$ for all $x, y$

Total probability equals 1

  • Discrete case: $\sum_{x} \sum_{y} P(X = x, Y = y) = 1$
  • Continuous case: $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,\mathrm{d}x\,\mathrm{d}y = 1$
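
A minimal sketch (assuming the same kind of made-up table as above) of checking both properties numerically for a discrete pmf:

```python
# Verify non-negativity and normalization of an illustrative joint pmf.
joint_pmf = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

assert all(p >= 0 for p in joint_pmf.values()), "non-negativity violated"
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12, "total probability must be 1"
```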

Marginal distributions

The distribution of an individual variable can be derived from the joint distribution:

  • Discrete case: $P(X = x) = \sum_{y} P(X = x, Y = y)$
  • Continuous case: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,\mathrm{d}y$
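
In the discrete case, marginalization is just a sum over the other variable. A sketch using the same illustrative table:

```python
from collections import defaultdict

# Marginalize out Y to get P(X = x) from an illustrative joint pmf.
joint_pmf = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

marginal_x = defaultdict(float)
for (x, y), p in joint_pmf.items():
    marginal_x[x] += p  # sum over all y for each fixed x

print(dict(marginal_x))  # {0: 0.4, 1: 0.6} up to floating-point rounding
```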

Conditional distributions

The distribution of one variable given a specific value of the other:

  • Discrete case: $P(X = x \mid Y = y) = \dfrac{P(X = x, Y = y)}{P(Y = y)}$
  • Continuous case: $f_{X \mid Y}(x \mid y) = \dfrac{f_{X,Y}(x, y)}{f_Y(y)}$
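
A sketch of the discrete conditional pmf, conditioning the same illustrative table on $Y = 1$:

```python
# Conditional pmf P(X = x | Y = 1) = P(X = x, Y = 1) / P(Y = 1).
joint_pmf = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

p_y1 = sum(p for (x, y), p in joint_pmf.items() if y == 1)  # P(Y = 1) = 0.7
cond_x_given_y1 = {x: p / p_y1 for (x, y), p in joint_pmf.items() if y == 1}

print(cond_x_given_y1)  # {0: 0.4285..., 1: 0.5714...}; sums to 1
```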

Independence

Random variables $X$ and $Y$ are independent iff:

  • Discrete case: $P(X = x, Y = y) = P(X = x)\,P(Y = y)$ for all $x, y$
  • Continuous case: $f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$ for all $x, y$
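
Independence can be checked on a discrete table by comparing every joint entry against the product of its marginals. A sketch (the illustrative table used here turns out to be dependent):

```python
from itertools import product

# Test whether P(X = x, Y = y) == P(X = x) * P(Y = y) for all (x, y).
joint_pmf = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

xs = {x for x, _ in joint_pmf}
ys = {y for _, y in joint_pmf}
p_x = {x: sum(joint_pmf[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint_pmf[(x, y)] for x in xs) for y in ys}

independent = all(
    abs(joint_pmf[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(xs, ys)
)
print(independent)  # False: e.g. P(0, 0) = 0.10 but P(X=0) * P(Y=0) = 0.4 * 0.3 = 0.12
```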

Representation

Joint distributions can be represented in various ways:

  • For discrete variables: probability mass tables or matrices
  • For continuous variables: joint density functions or contour plots
  • Copulas: functions that describe the dependence structure between variables
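
As an illustration of the continuous case, a joint density can be visualized as a contour plot. The sketch below assumes a bivariate normal with unit variances and correlation 0.5, using NumPy, SciPy and Matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import multivariate_normal

# Illustrative joint density: bivariate normal with correlation 0.5.
rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

# Evaluate the density on a grid and draw its contour lines.
x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
density = rv.pdf(np.dstack((x, y)))

plt.contour(x, y, density)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Joint density f(x, y) as a contour plot")
plt.show()
```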

Types

For Discrete Variables

For a joint probability mass function $p(x, y)$, if $X$ and $Y$ are independent, $p(x, y) = p_X(x)\,p_Y(y)$.

Cumulative probability: $F(x, y) = P(X \le x, Y \le y) = \sum_{x_i \le x} \sum_{y_j \le y} p(x_i, y_j)$

For the marginal probability of $X$: $p_X(x) = \sum_{y} p(x, y)$.
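
A sketch of the joint cumulative probability for a discrete table, accumulating all entries at or below the given point:

```python
# Joint CDF F(x, y) = P(X <= x, Y <= y) for an illustrative discrete pmf.
joint_pmf = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

def joint_cdf(x, y):
    """Sum all pmf entries with X <= x and Y <= y."""
    return sum(p for (xi, yj), p in joint_pmf.items() if xi <= x and yj <= y)

print(joint_cdf(0, 1))  # P(X <= 0, Y <= 1) = 0.10 + 0.30 = 0.40
print(joint_cdf(1, 1))  # entire table, equals 1 (up to rounding)
```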

For Continuous Variables

Suppose $f(x, y)$ is the joint probability density function. The joint probability for any region $A$ lying in the x-y plane is:

$$P\big((X, Y) \in A\big) = \iint_A f(x, y)\,\mathrm{d}x\,\mathrm{d}y$$

The cumulative distribution function:

$$F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(u, v)\,\mathrm{d}u\,\mathrm{d}v$$

For the marginal probability density function of $X$:

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,\mathrm{d}y$$
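
For the continuous case, the region probability, the CDF and the marginal can all be approximated by numerical integration. A sketch using SciPy with an illustrative density $f(x, y) = x + y$ on the unit square (a valid joint pdf, since it integrates to 1 over its support):

```python
from scipy import integrate

# Illustrative joint density: f(x, y) = x + y for 0 <= x, y <= 1, 0 elsewhere.
# dblquad expects the inner integration variable (y) as the first argument.
f = lambda y, x: x + y

# Total probability over the support: should be (approximately) 1.
total, _ = integrate.dblquad(f, 0, 1, 0, 1)

# CDF at (0.5, 0.5): P(X <= 0.5, Y <= 0.5).
cdf_half, _ = integrate.dblquad(f, 0, 0.5, 0, 0.5)

# Marginal density of X at x = 0.5: integrate f(0.5, y) over y.
marginal_x_half, _ = integrate.quad(lambda y: 0.5 + y, 0, 1)

print(total)            # ~1.0
print(cdf_half)         # ~0.125
print(marginal_x_half)  # f_X(0.5) = 1.0
```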