Probability

Let us consider an experiment with a pack of 52 playing cards. Each draw of a card is called an event.

If we draw one card at a time from a pack of 52 cards, there are 52 possible events, designated E1, E2, …, E52.

Other examples of events are the throw of a die and the toss of a coin. In general, the outcome of a single trial of an experiment is called an event.

Random experiment: An experiment in which each trial is performed under identical conditions, yet the outcome is not always the same but is one of several possible outcomes, is called a random experiment.

Examples of random experiments are drawing a card from a well-shuffled pack of cards and tossing a coin.

Sample space: The sample space (S) is the set of all possible outcomes of a random experiment. Consider the following examples:

  • On tossing a fair coin, we have,
\[\displaystyle S=\left\{ {H,T} \right\}i.e.\left\{ {head,tail} \right\}\]
  • On the throw of a die, we get,
\[\displaystyle S=\left\{ {1,2,3,4,5,6} \right\}\]
  • When two coins are tossed together, the possible outcomes are,
\[\displaystyle S=\left\{ {HH,HT,TH,TT} \right\}\]
  • When two dice are thrown together,
\[\displaystyle \begin{array}{l}S=\{(1,1),(1,2),(1,3),(1,4),(1,5),(1,6),\\(2,1),(2,2),(2,3),(2,4),(2,5),(2,6),\\(3,1),(3,2),(3,3),(3,4),(3,5),(3,6),\\(4,1),(4,2),(4,3),(4,4),(4,5),(4,6),\\(5,1),(5,2),(5,3),(5,4),(5,5),(5,6),\\(6,1),(6,2),(6,3),(6,4),(6,5),(6,6)\}\end{array}\]

Simple event: Each outcome of an experiment is called a simple event.

Event: Any combination of simple events is known as an event. An event is denoted by an upper-case letter such as E.

For example, when a die is thrown, each of 1, 2, 3, 4, 5, 6 is a simple event, and getting a prime number is an event: E = {2, 3, 5}.

Mutually exclusive events: A set of events is said to be mutually exclusive if the happening of one event excludes the happening of the other. So, E1 and E2 are mutually exclusive if

E1 ∩ E2 = Φ, where Φ denotes the empty set.

If E1 ∩ E2 ≠ Φ, then E1 and E2 are called compatible events.

Consider the throw of a die; we have,

S = {1,2,3,4,5,6}

Let E1 be the event of getting a number less than 3; clearly, E1 = {1, 2}. Let E2 be the event of getting a number greater than 4; so, E2 = {5, 6}. Then E1 ∩ E2 = Φ, so E1 and E2 are mutually exclusive.

Exhaustive events: The events E1, E2, …, Ek such that E1 ∪ E2 ∪ … ∪ Ek = S are called exhaustive events.
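
For example, in a single throw of a die, E1 = {1, 2}, E2 = {3, 4} and E3 = {5, 6} are exhaustive events, since E1 ∪ E2 ∪ E3 = {1, 2, 3, 4, 5, 6} = S.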

Equally likely events: Events are said to be equally likely if none of them is expected to occur in preference to the others.

For example, drawing a card from a well-shuffled pack of cards results in 52 equally likely events.

Probability: In a random experiment, let S be the sample space and let E be a subset of S, i.e. E ⊆ S. Then E is an event, and

\[\displaystyle P(E)=\frac{{\text{Number of distinct elements in E}}}{{\text{Number of distinct elements in S}}}=\frac{{n(E)}}{{n(S)}}---(1)\]
\[\displaystyle =\frac{{\text{Number of outcomes favourable to E}}}{{\text{Number of all possible outcomes}}}---(2)\]
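
For example, let E be the event of drawing an ace from a well-shuffled pack of 52 cards. Then n(E) = 4 and n(S) = 52, so,
\[\displaystyle P(E)=\frac{{n(E)}}{{n(S)}}=\frac{4}{{52}}=\frac{1}{{13}}\]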

Odds in favour of an event and odds against it: If m is the number of ways an event can occur and n is the number of ways in which it does not occur, then,

\[\displaystyle (i)\text{odds in favour of the event = }\frac{m}{n}---(3)\]
\[\displaystyle (ii)\text{odds against the event = }\frac{n}{m}---(4)\]
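
For example, for the event of drawing an ace from a pack of 52 cards, m = 4 and n = 48, so the odds in favour of the event are 4/48 = 1/12 (i.e. 1 : 12) and the odds against the event are 48/4 = 12 (i.e. 12 : 1). Note that the probability of the event is m/(m + n) = 4/52 = 1/13.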

Complementary event: Suppose S is the sample space and E ⊆ S, so that E is an event. Then E^c = S − E ⊆ S is also an event, called the complementary event of E, and it is denoted by E^c, E′ or Ē.
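
For example, in a single throw of a die with S = {1, 2, 3, 4, 5, 6}, if E = {2, 4, 6} is the event of getting an even number, then E^c = {1, 3, 5}, and P(E) + P(E^c) = 1.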

Addition Theorem

\[\displaystyle P\left( {{{E}_{1}}\bigcup {{E}_{2}}} \right)=P\left( {{{E}_{1}}} \right)+P\left( {{{E}_{2}}} \right)-P\left( {{{E}_{1}}\bigcap {{E}_{2}}} \right)---(5)\]

If E1 ∩ E2 = Φ (i.e. the empty set, so E1 and E2 are mutually exclusive), then

\[\displaystyle P\left( {{{E}_{1}}\bigcup {{E}_{2}}} \right)=P\left( {{{E}_{1}}} \right)+P\left( {{{E}_{2}}} \right)---(6)\]
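
For example, let E1 be the event of drawing a king and E2 the event of drawing a heart from a pack of 52 cards. Then P(E1) = 4/52, P(E2) = 13/52 and P(E1 ∩ E2) = 1/52 (the king of hearts), so,
\[\displaystyle P\left( {{{E}_{1}}\bigcup {{E}_{2}}} \right)=\frac{4}{{52}}+\frac{{13}}{{52}}-\frac{1}{{52}}=\frac{{16}}{{52}}=\frac{4}{{13}}\]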

Independent events: Two events are said to be independent if the occurrence of one does not depend on the occurrence of the other. For example, suppose two coins are tossed.

Let E1 be the event of getting a head on the 1st coin and E2 be the event of getting a head on the 2nd coin. The occurrence of a head on the 2nd coin does not depend on the occurrence of a head on the 1st coin, so E1 and E2 are independent events.

Multiplication Theorem: If E1 and E2 are independent events, then,

\[\displaystyle P\left( {{{E}_{1}}\bigcap {{E}_{2}}} \right)=P\left( {{{E}_{1}}} \right).P\left( {{{E}_{2}}} \right)---(7)\]
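
For example, when two coins are tossed, P(E1) = 1/2 and P(E2) = 1/2, so the probability of getting heads on both coins is,
\[\displaystyle P\left( {{{E}_{1}}\bigcap {{E}_{2}}} \right)=\frac{1}{2}\times \frac{1}{2}=\frac{1}{4}\]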

Conditional probability: The probability of the occurrence of an event E1, given that an event E2 has already occurred, is called the conditional probability and is denoted P(E1/E2). It can be shown that,

\[\displaystyle (i)P\left( {\frac{{{{E}_{1}}}}{{{{E}_{2}}}}} \right)=\frac{{P\left( {{{E}_{1}}\bigcap {{E}_{2}}} \right)}}{{P\left( {{{E}_{2}}} \right)}}---(8)\]
\[\displaystyle (ii)P\left( {\frac{{{{E}_{2}}}}{{{{E}_{1}}}}} \right)=\frac{{P\left( {{{E}_{1}}\bigcap {{E}_{2}}} \right)}}{{P\left( {{{E}_{1}}} \right)}}----(9)\]
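
For example, in a single throw of a die, let E1 = {2} be the event of getting a 2 and E2 = {2, 4, 6} be the event of getting an even number. Then P(E1 ∩ E2) = 1/6 and P(E2) = 1/2, so,
\[\displaystyle P\left( {\frac{{{{E}_{1}}}}{{{{E}_{2}}}}} \right)=\frac{{1/6}}{{1/2}}=\frac{1}{3}\]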

Binomial theorem of probability: Suppose an experiment consists of n independent trials, with p as the probability of success and q = 1 − p as the probability of failure in each trial. Then,

\[\displaystyle {{P}_{{(r\text{ successes})}}}={}^{n}{{C}_{r}}{{p}^{r}}{{q}^{{n-r}}}---(10)\]
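
For example, if a fair coin is tossed n = 4 times with p = q = 1/2, the probability of getting exactly r = 2 heads is,
\[\displaystyle {{P}_{{(2\text{ successes})}}}={}^{4}{{C}_{2}}{{\left( {\frac{1}{2}} \right)}^{2}}{{\left( {\frac{1}{2}} \right)}^{2}}=6\times \frac{1}{{16}}=\frac{3}{8}\]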

Mathematical expectation: If x is a random variable taking values x1, x2, x3, …, xn with corresponding probabilities P1, P2, P3, …, Pn, then the mathematical expectation of x is defined as,

\[\displaystyle E(x)={{x}_{1}}{{P}_{1}}+{{x}_{2}}{{P}_{2}}+......+{{x}_{n}}{{P}_{n}}\]
\[\displaystyle =\sum\limits_{{i=1}}^{n}{{{{x}_{i}}{{P}_{i}}----(11)}}\]
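
For example, if x is the number obtained in a single throw of a fair die, each of the values 1, 2, …, 6 has probability 1/6, so,
\[\displaystyle E(x)=\frac{1}{6}\left( {1+2+3+4+5+6} \right)=\frac{{21}}{6}=3.5\]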