# Binomial Distribution

One of the most important probability distributions at this level is the Binomial distribution, which is the subject of this article.

Binomial distributions arise from experiments that are binary in nature, i.e., whose outcomes can be grouped into two classes, say, success and failure, or 1 and 0. For example, when you toss a coin, there are only two possible outcomes: Heads (which you may call success) and Tails (which then becomes failure). Note that an experiment need not have only two outcomes for it to be called binary. For example, consider the experiment of rolling a die with the following definitions:

Success: Numbers 1, 2, or 3

Failure: Numbers 4, 5, or 6

then, with respect to this definition, the experiment is binary. Thus, an experiment needs to have two **classes** of outcomes for it to be called binary. From now onwards, we consider the general experiment with two outcomes: success and failure, such that

*P*(Success) = *s*

*P*(Failure) = *f*

Every time such an experiment is repeated, we say that a trial of the experiment has been performed. We assume that the outcome of any trial is independent of the outcome of every other trial, and that the probabilities of success and failure are the same on each trial. Such trials are called Bernoulli trials.

Let us then formally state the conditions that Bernoulli trials should satisfy:

(1) There should be a finite number of trials

(2) The trials should be mutually independent

(3) Each trial should have exactly two outcomes; call them success and failure. Their probabilities for every trial should be the same.

As an example, consider the experiment of tossing a fair coin 10 times, with Heads being termed success on each toss. Thus, this experiment consists of 10 Bernoulli trials such that

\(\begin{align}& s=P\left( Success \right)\text{ }=P\left( Heads \right)\text{ }=\frac{1}{2} \\ & f=1-s=\frac{1}{2} \\ \end{align}\)
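As a side note, this ten-toss experiment is easy to simulate. The sketch below is illustrative only (the fixed seed and the 100,000-run count are our own choices): it estimates the relative frequency of each possible number of Heads and compares it with the exact binomial probability \(^{10}C_r \left(\tfrac{1}{2}\right)^{10}\) derived in the discussion that follows.

```python
import random
from collections import Counter
from math import comb

random.seed(0)          # fixed seed for reproducibility
s, n, runs = 0.5, 10, 100_000

# Repeat the 10-toss experiment many times, counting Heads each time.
counts = Counter(
    sum(random.random() < s for _ in range(n)) for _ in range(runs)
)

# Compare observed frequencies against nCr * s^r * (1-s)^(n-r),
# the binomial probabilities derived in the text that follows.
for r in range(n + 1):
    observed = counts[r] / runs
    exact = comb(n, r) * s**r * (1 - s)**(n - r)
    print(f"{r:2d} Heads: observed {observed:.4f}, exact {exact:.4f}")
```

With 100,000 repetitions, the observed frequencies agree with the exact values to about two decimal places.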

We’ll now try to understand how the word “binomial” comes up in relation to Bernoulli trials.

Consider a sequence of *n* Bernoulli trials with probabilities of success and failure on each trial being *s* and *f* respectively. We’ll now pose some questions that will amply justify the word ‘Binomial’:

• What is the probability of *n* successes? : This is the probability that we obtain success on every trial, i.e.,

*P*(*n* successes) = *s* × *s* × *s* × ....×*s* (*n* times)

= *s ^{n}*

• What is the probability of (*n* – 1) successes? : This is the probability that there is a failure on exactly one trial, with the rest being successes. The failing trial can be any one of the *n* trials, i.e., it can be chosen in ^{n}C_{1} ways, so that

\[\begin{align}& P\left( \left( \text{ n} -1 \right)\text{successes} \right) \\ &\quad\qquad ={{\ }^{n}}{{C}_{1}}\times f\times \{s\times s\times ....s\,\,\,\,\,\,\,\,(n-1)\text{times}\} \\ & \quad\qquad={{\ }^{n}}{{C}_{1}}\ {{s}^{n-1}}f \\ \end{align}\]

• What is the probability of (*n* – 2) successes? : This is the probability that there should be failures on any two trials, which can happen in ^{n}C_{2} ways, and the rest should be successes, so that

\[\begin{align}& P\left( \left( \text{n}-2 \right)\text{ }successes \right) \\ &\qquad\quad ={{\ }^{n}}{{C}_{2}}\ \times f\times f\times \{s\times s\times ...s\,\,\,\,\,\,\,(n-2)\ \text{times}\} \\ & \qquad\quad={{\ }^{n}}{{C}_{2}}\ {{s}^{n-2}}{{f}^{2}} \\ \end{align}\]

Continuing in this way, we see that the probability of *r* successes in *n* Bernoulli trials is

\[P\left( \text{r successes} \right)={{\ }^{n}}{{C}_{r}}\ {{s}^{r}}\ {{f}^{n-r}}\]

which is actually the (*r* + 1)^{th} term in the Binomial expansion of (*f* + *s*)^{n}. This is the reason the distribution is called Binomial. The complete distribution is:

| \(\text{No. of successes}\) | \(0\) | \(1\) | \(2\) | \(\cdots\) | \(n\) |
| --- | --- | --- | --- | --- | --- |
| \(\text{P(No. of successes)}\) | \(^{n}{{C}_{0}}\,{{f}^{n}}\) | \(^{n}{{C}_{1}}\,{{f}^{n-1}}{{s}^{1}}\) | \(^{n}{{C}_{2}}\,{{f}^{n-2}}{{s}^{2}}\) | \(\cdots\) | \(^{n}{{C}_{n}}\,{{s}^{n}}\) |

For example, in case of 3 Bernoulli trials, we have

| \(\text{No. of successes}\) | \(0\) | \(1\) | \(2\) | \(3\) |
| --- | --- | --- | --- | --- |
| \(\text{P(No. of successes)}\) | \({{f}^{3}}\) | \(3{{f}^{2}}s\) | \(3f{{s}^{2}}\) | \({{s}^{3}}\) |
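As a quick check of the table above, the sketch below (illustrative; `binomial_pmf` is our own helper, and *s* = 1/3 an arbitrary choice) evaluates \(P(r\ \text{successes})\) for *n* = 3 and confirms that the four probabilities sum to \((f+s)^3 = 1\), which is exactly the binomial theorem in action:

```python
from math import comb, isclose

def binomial_pmf(r: int, n: int, s: float) -> float:
    """P(r successes in n Bernoulli trials, success probability s)."""
    f = 1 - s
    return comb(n, r) * s**r * f**(n - r)

n, s = 3, 1/3  # so f = 2/3

# Reproduce the n = 3 row of the table: f^3, 3 f^2 s, 3 f s^2, s^3
for r in range(n + 1):
    print(f"P({r} successes) = {binomial_pmf(r, n, s):.4f}")

# The probabilities are exactly the terms of (f + s)^n, so they sum to 1.
assert isclose(sum(binomial_pmf(r, n, s) for r in range(n + 1)), 1.0)
```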

**Example – 21**

A man takes a step forward with probability 0.4 and backward with probability 0.6. What is the probability that at the end of 11 steps, he will be one step away from the starting point?

**Solution:** Visualise the situation. To be just one step away from the starting point after 11 steps, there are only two cases possible

**(a)** he has taken 6 steps forward and 5 backward in some order

OR

**(b)** he has taken 5 steps forward and 6 backward in some order.

If we let a step forward denote success and a step backward denote failure, we have

*s* = 0.4 , *f* = 0.6

so that,

\[\begin{align} P\left\{ \begin{gathered} \text{one step away} \\ \text{after 11 steps} \end{gathered} \right\} &= P\left\{ \begin{gathered} \text{6 successes,} \\ \text{5 failures} \end{gathered} \right\}+P\left\{ \begin{gathered} \text{5 successes,} \\ \text{6 failures} \end{gathered} \right\} \\ &={{\,}^{11}}{{C}_{6}}\ {{s}^{6}}{{f}^{5}}+{{\ }^{11}}{{C}_{5}}\ {{s}^{5}}{{f}^{6}} \\ &={{\,}^{11}}{{C}_{6}}\ {{s}^{5}}{{f}^{5}}\left( s+f \right) \qquad \left( \because \ {{\ }^{11}}{{C}_{5}}={{\,}^{11}}{{C}_{6}} \right) \\ &=\frac{11!}{6!\ 5!}{{\left( 0.4 \right)}^{5}}{{\left( 0.6 \right)}^{5}} \qquad \left( \because \ s+f=1 \right) \\ &\approx 0.37 \end{align}\]
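The arithmetic above is easy to verify numerically; this is an illustrative check, not part of the original solution:

```python
from math import comb, isclose

s, f = 0.4, 0.6  # step forward = success, step backward = failure

# Case (a): 6 forward, 5 backward; case (b): 5 forward, 6 backward.
p = comb(11, 6) * s**6 * f**5 + comb(11, 5) * s**5 * f**6

# Since 11C6 = 11C5 and s + f = 1, this collapses to a single term.
p_collapsed = comb(11, 6) * s**5 * f**5
assert isclose(p, p_collapsed)

print(round(p, 4))  # 0.3679, i.e. about 0.37
```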

**Example – 22**

*X* and *Y* are playing a tournament consisting of a series of matches; the first to win (*n* + 1) matches wins the tournament. The probabilities of their winning any given match are *p* and *q* respectively, where *p* + *q* = 1 (no match is drawn). Find their respective probabilities of winning the tournament.

**Solution:** To make things easier to understand, let us write down some cases explicitly. We will calculate the probability of *X* winning the tournament:

| Case in which *X* wins the tournament | P(Case) |
| --- | --- |
| \(X\) wins the first \((n+1)\) matches straight | \({{p}^{n+1}}\) |
| \(X\) wins \(n\) of the first \((n+1)\) matches, then wins the \({{(n+2)}^{\text{th}}}\) match | \(\left( ^{n+1}{{C}_{n}}\ {{p}^{n}}q \right)\cdot p\) |
| \(X\) wins \(n\) of the first \((n+2)\) matches, then wins the \({{(n+3)}^{\text{th}}}\) match | \(\left( ^{n+2}{{C}_{n}}\ {{p}^{n}}{{q}^{2}} \right)\cdot p\) |
| \(\vdots\) | \(\vdots\) |
| \(X\) wins \(n\) of the first \((n+r)\) matches, then wins the \({{(n+r+1)}^{\text{th}}}\) match | \(\left( ^{n+r}{{C}_{n}}\ {{p}^{n}}{{q}^{r}} \right)\cdot p\) |
| \(\vdots\) | \(\vdots\) |
| \(X\) wins \(n\) of the first \(2n\) matches, then wins the \({{(2n+1)}^{\text{th}}}\) match | \(\left( ^{2n}{{C}_{n}}\ {{p}^{n}}{{q}^{n}} \right)\cdot p\) |

Thus,

\[P\left\{ X\text{ wins the tournament} \right\}=\sum\limits_{r\ =\ 0}^{n}{^{n+r}{{C}_{n}}\ {{p}^{n+1}}}{{q}^{r}}\]

Similarly,

\[P\left\{ Y\text{ wins the tournament} \right\}=\sum\limits_{r\ =\ 0}^{n}{^{n+r}{{C}_{n}}\ {{q}^{n+1}}}{{p}^{r}}\]
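Since every match is won by one of the players, one of *X* or *Y* must eventually win the tournament, so the two sums above must add to 1 when *q* = 1 – *p*. The sketch below (illustrative; `win_probability` is our own helper, and the values of *n* and *p* are arbitrary) checks this:

```python
from math import comb, isclose

def win_probability(n: int, p: float) -> float:
    """P(a player wins a first-to-(n+1) tournament, winning each match w.p. p)."""
    q = 1 - p
    return sum(comb(n + r, n) * p**(n + 1) * q**r for r in range(n + 1))

# X's probability uses p; Y's uses q = 1 - p. They must sum to 1.
for n in (1, 2, 5):
    for p in (0.3, 0.5, 0.6):
        px, py = win_probability(n, p), win_probability(n, 1 - p)
        assert isclose(px + py, 1.0)

print("P(X wins) + P(Y wins) = 1 in every tested case")
```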

**TRY YOURSELF - IV**

**Q. 1** A random variable *X* has the following *PD*:

| \(X\) | \(0\) | \(1\) | \(2\) | \(3\) | \(4\) | \(5\) | \(6\) | \(7\) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| \(P(X)\) | \(0\) | \(k\) | \(2k\) | \(2k\) | \(3k\) | \({{k}^{2}}\) | \(2{{k}^{2}}\) | \(7{{k}^{2}}+k\) |

**(a)** Find *k*

**(b)** Find *P*(*X* < 3)

**(c)** Find *P*(*X* > 6)

**Q. 2** Show by explicit calculation that the mean number of heads in five tosses of a fair coin is 2.5.

**Q. 3** In a sequence of 4 Bernoulli trials, we have \(\begin{align}s=\frac{1}{3},\ f=\frac{2}{3}\end{align}\) on every trial. What is the expected number of successes? Can you generalise this answer?

**Q. 4** Ten eggs are drawn successively, **with replacement** (i.e. each egg is put back after being drawn), from a lot containing 10% defective eggs. Find the probability that there was at least one defective egg among those ten eggs.

**Q. 5** For a coin, Heads is three times as likely to occur as Tails. This coin is tossed twice. Find the *PD* of the number of Tails.

**Q. 6** A person takes a step forward with probability 0.45 and backward with probability 0.55. What is the probability that after 13 steps, he is 3 steps away from the starting point?

**Q. 7** A die is thrown 7 times. What is the probability that an odd number turns up exactly 4 times?

**Q. 8** A coin is tossed 21 times. If the probability of getting more heads than tails is the same as the probability of getting more tails than heads, prove that the coin must be unbiased.

**Q. 9** An unbiased die is rolled 6 times. Let *E* be the event that the number obtained on a roll is neither 1 nor 6. Find the probability that:

**(a)** *E* occurs exactly 4 times out of the 6 trials

**(b)** *E* occurs at least once

**Q. 10** The probability of a shooter hitting a target is \(\begin{align}\frac{2}{3}\end{align}\) and at least 2 direct hits are required to win the competition. Find the least number of shots required so that the probability that the shooter wins is greater than 7/9.