A r.v. $X$ has the **Bernoulli distribution** with parameter $p$, where $0<p<1$, if $P(X=1)=p$ and $P(X=0)=1−p$.

- Note that $p$ cannot be exactly $0$ or $1$, so both of these probabilities are positive.
- We say that $X$ has the Bernoulli distribution with parameter $p$. The textbook writes this as $X∼Bern(p)$.

## Bernoulli trial

An experiment represented by an indicator variable $I_{A}$ for some event $A$ that we're testing.

- $I_{A}=1$ if $A$ occurs, and equals 0 if $A$ doesn’t occur.
- If $0<P(A)<1$, then $I_{A}∼Bern(p)$, where $p=P(A)$.
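A Bernoulli trial is easy to simulate. Here is a minimal sketch in which a uniform draw landing below $p$ stands in for the event $A$ occurring; the helper name `indicator_trial` is invented for illustration:

```python
import random

def indicator_trial(p, trials=100_000, seed=0):
    """Simulate I_A ~ Bern(p): I_A = 1 if the event A occurs, else 0.

    Here "A occurs" is modeled as a uniform draw falling below p,
    so P(A) = p by construction.
    """
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(trials)]

draws = indicator_trial(0.3)
print(sum(draws) / len(draws))  # sample mean, close to p = 0.3
```

The sample mean of the 0/1 draws estimates $P(A)=p$, matching $E(I_A)=p$.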

## PMF

Since $X$ can only take on a value of $0$ or $1$, the PMF of $X$ can be given, for $k∈\{0,1\}$, by

$$P(X=k)=\begin{cases} p & \text{if } k=1 \\ 1−p & \text{if } k=0 \end{cases}$$

## Variance
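The two-case PMF can be written out directly in code. A minimal sketch, with `bern_pmf` an invented helper name:

```python
def bern_pmf(k, p):
    """PMF of Bern(p): P(X=k) = p if k = 1, 1 - p if k = 0, else 0."""
    if k not in (0, 1):
        return 0.0
    return p if k == 1 else 1 - p

print(bern_pmf(1, 0.3))  # p = 0.3
print(bern_pmf(0, 0.3))  # 1 - p = 0.7
```

The two probabilities sum to $1$, as a PMF must.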

If $X$ is an indicator variable with $P(X=1)=p$, then its variance is $Var(X)=p(1−p)$.

We can prove this by showing $E(X^2)=p$: since the support is $\{0,1\}$ and $0^2=0$, $1^2=1$, we have $X^2=X$ and so $E(X^2)=E(X)=p$. Then $Var(X)=E(X^2)−E(X)^2=p−p^2=p(1−p)$.

The largest possible variance of an indicator r.v. is $Var(X)=0.25$, attained at $p=0.5$, since $p(1−p)$ is maximized there.
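The identity $Var(X)=p−p^2=p(1−p)$ and its peak at $p=0.5$ can be checked numerically; `bern_variance` below is an invented helper name:

```python
def bern_variance(p):
    # E(X) = p and E(X^2) = p (since 0^2 = 0 and 1^2 = 1), so
    # Var(X) = E(X^2) - E(X)^2 = p - p^2 = p(1 - p)
    return p - p * p

# scan p over a coarse grid: the variance peaks at p = 0.5
values = [bern_variance(p / 10) for p in range(11)]
print(max(values))  # 0.25, at p = 0.5
```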

## MGF

Use LOTUS to calculate the MGF. We have

$$M_{X}(t)=E(e^{tX})=e^{0⋅t}(1−p)+e^{1⋅t}p=pe^{t}+(1−p)$$

for any $t∈\mathbb{R}$.
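As a sanity check on the MGF, $M_X(0)=1$ (true of any MGF) and $M_X'(0)=E(X)=p$. The sketch below verifies both numerically, approximating the derivative by a central difference; `bern_mgf` is an invented helper name:

```python
import math

def bern_mgf(t, p):
    """M_X(t) = E(e^{tX}) = (1-p)e^{0*t} + p e^{1*t} = p e^t + (1-p)."""
    return p * math.exp(t) + (1 - p)

p = 0.3
print(bern_mgf(0.0, p))  # M(0) = p + (1 - p) = 1

# central-difference approximation of M'(0), which should equal E(X) = p
h = 1e-6
deriv = (bern_mgf(h, p) - bern_mgf(-h, p)) / (2 * h)
print(deriv)  # close to 0.3
```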