Bernoulli Distribution

The fundamental building block of discrete probability, modeling a single trial with two outcomes.

The "Single Coin Flip"

The Bernoulli distribution is the simplest of all discrete distributions. It models a single event or trial that has only two possible outcomes: a "success" or a "failure".

Think of it as a single coin flip (Heads or Tails), a single trade (Win or Loss), or a single bond (Default or No Default). The entire distribution is described by a single parameter, p, which is the probability of success.

Interactive Bernoulli Trial
Adjust the probability of success (p) to see how it affects the outcome probabilities. At p = 0.70, for example, the mean is \mu = 0.70 and the variance is \sigma^2 = 0.21.
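The same behavior can be reproduced with a quick simulation. Below is a minimal Python sketch (the helper name `bernoulli_trial` and the sample size are illustrative, not from the original):

```python
import random

def bernoulli_trial(p: float) -> int:
    """Return 1 (success) with probability p, else 0 (failure)."""
    return 1 if random.random() < p else 0

# Simulate many trials at p = 0.7 and estimate the mean and variance.
p = 0.7
n = 100_000
samples = [bernoulli_trial(p) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
print(f"mean ~ {mean:.3f}, variance ~ {var:.3f}")  # should land near 0.70 and 0.21
```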

Core Concepts

Probability Mass Function (PMF)
The PMF gives the probability that the random variable X is exactly equal to some value k. For a Bernoulli trial, k can only be 0 (failure) or 1 (success).
P(X=k) = p^k (1-p)^{1-k} \quad \text{for } k \in \{0, 1\}
  • If k = 1 (success), the formula becomes P(X=1) = p^1(1-p)^{1-1} = p \cdot 1 = p.
  • If k = 0 (failure), the formula becomes P(X=0) = p^0(1-p)^{1-0} = 1 \cdot (1-p) = 1-p.

The interactive chart above is a direct visualization of this PMF.
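The two cases above can be encoded directly. A minimal Python sketch of this PMF (the function name is illustrative):

```python
def bernoulli_pmf(k: int, p: float) -> float:
    """P(X = k) = p^k * (1 - p)^(1 - k) for k in {0, 1}."""
    if k not in (0, 1):
        raise ValueError("Bernoulli outcomes are 0 or 1")
    return p**k * (1 - p) ** (1 - k)

print(bernoulli_pmf(1, 0.7))  # 0.7, the probability of success
print(bernoulli_pmf(0, 0.7))  # 1 - p, approximately 0.3
```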

Cumulative Distribution Function (CDF)
The CDF gives the probability that the random variable X is less than or equal to some value x. It's a running total of the PMF.
F(x) = P(X \le x)
| Value of x | CDF: F(x) = P(X \le x) | Explanation |
| --- | --- | --- |
| x < 0 | 0 | The outcome cannot be less than 0. |
| 0 \le x < 1 | 1 - p | The only possible value in this range is 0. |
| x \ge 1 | 1 | Includes both outcomes 0 and 1, so probability is (1-p) + p = 1. |
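The three cases of the piecewise CDF translate directly into code; a minimal sketch (function name illustrative):

```python
def bernoulli_cdf(x: float, p: float) -> float:
    """F(x) = P(X <= x) for a Bernoulli(p) random variable."""
    if x < 0:
        return 0.0       # no probability mass below 0
    if x < 1:
        return 1.0 - p   # only the outcome 0 satisfies X <= x
    return 1.0           # both outcomes 0 and 1 satisfy X <= x

print(bernoulli_cdf(-0.5, 0.7))  # 0.0
print(bernoulli_cdf(1.5, 0.7))   # 1.0
```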

Key Derivations

Deriving the Mean & Variance

Deriving the Expected Value (Mean)

The expected value is the sum of each outcome multiplied by its probability.

Step 1: Set up the Summation

The formula for the expected value of a discrete random variable is:

E[X] = \sum_{k} k \cdot P(X=k)

Step 2: Apply to the Bernoulli Case

For the Bernoulli distribution, our outcomes (k) are 0 (failure) and 1 (success).

E[X] = (0 \cdot P(X=0)) + (1 \cdot P(X=1))
E[X] = (0 \cdot (1-p)) + (1 \cdot p)
E[X] = 0 + p

Step 3: Final Result

Final Mean Formula
E[X] = p

This makes intuitive sense: if a trade has a 70% (p = 0.7) chance of success, the expected outcome of a single trial is 0.7.
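The summation in Step 2 can be checked numerically; a small Python sketch (function name illustrative):

```python
def expected_value(p: float) -> float:
    """E[X] = sum over k of k * P(X = k) for a Bernoulli(p) variable."""
    pmf = {0: 1 - p, 1: p}  # P(X=0) = 1-p, P(X=1) = p
    return sum(k * prob for k, prob in pmf.items())

print(expected_value(0.7))  # 0.7, matching E[X] = p
```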

Deriving the Variance

We use the formula Var(X) = E[X^2] - (E[X])^2. We already know E[X] = p, so we first need to find E[X^2].

Step 1: Find the Second Moment, E[X²]

We use the same summation logic, but with k^2.

E[X^2] = \sum_{k} k^2 \cdot P(X=k)
E[X^2] = (0^2 \cdot P(X=0)) + (1^2 \cdot P(X=1))
E[X^2] = (0 \cdot (1-p)) + (1 \cdot p) = p

Step 2: Calculate the Variance

Now, we substitute everything back into the variance formula:

Var(X) = E[X^2] - (E[X])^2 = p - p^2
Final Variance Formula
Var(X) = p(1-p)

Notice that the variance is maximized when p = 0.5 (a 50/50 coin flip has the highest uncertainty) and is 0 when p = 0 or p = 1 (the outcome is certain).
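Both derivation steps, and the observation about where the variance peaks, can be verified in a few lines (function name illustrative):

```python
def bernoulli_variance(p: float) -> float:
    """Var(X) = E[X^2] - (E[X])^2; for Bernoulli, E[X^2] = E[X] = p."""
    pmf = {0: 1 - p, 1: p}
    second_moment = sum(k**2 * prob for k, prob in pmf.items())
    mean = sum(k * prob for k, prob in pmf.items())
    return second_moment - mean**2

# Largest at p = 0.5, zero at the certain endpoints.
print(bernoulli_variance(0.5))  # 0.25
print(bernoulli_variance(0.0))  # 0.0
```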