Lesson 2: Properties of Probability

In this lesson, we learn the fundamental concepts of probability. It is this lesson that will allow us to start putting our first tools into our new probability toolbox.

  • Learn why an understanding of probability is so critically important to the advancement of most kinds of scientific research.
  • Learn the definition of an event.
  • Learn how to derive new events by taking subsets, unions, intersections, and/or complements of already existing events.
  • Learn the definitions of specific kinds of events, namely empty events, mutually exclusive (or disjoint) events, and exhaustive events.
  • Learn the formal definition of probability.
  • Learn three ways of assigning a probability to an event: the personal opinion approach, the relative frequency approach, and the classical approach.
  • Learn five fundamental theorems, which when applied, allow us to determine probabilities of various events.
  • Get lots of practice calculating probabilities of various events.

2.1 - Why Probability?

In the previous lesson, we discussed the big picture of the course without really delving into why the study of probability is so vitally important to the advancement of science. Let's do that now by looking at two examples.

Example 2-1

Suppose that the Penn State Committee for the Fun of Students claims that the average number of concerts attended yearly by Penn State students is 2. Then, suppose that we take a random sample of 50 Penn State students and determine that the average number of concerts attended by the 50 students is:

\(\dfrac{1+4+3+\ldots+2}{50}=3.2\)

that is, 3.2 concerts per year. That raises the question: if the actual population average is 2, how likely is it that we'd get a sample average as large as 3.2?

What do you think? Is it likely or not likely? If the answer to the question is ultimately "not likely", then we have two possible conclusions:

  • Either: The true population average is indeed 2. We just happened to select a strange and unusual sample.
  • Or: Our original claim of 2 is wrong. Reject the claim, and conclude that the true population average is more than 2.

Of course, I don't raise this example simply to draw conclusions about the frequency with which Penn State students attend concerts. Instead I raise it to illustrate that in order to use a random sample to draw a conclusion about a population, we need to be able to answer the question "how likely...?", that is "what is the probability...?". Let's take a look at another example.
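
To make the "how likely...?" question concrete, here is a minimal simulation sketch. It assumes, purely for illustration, that yearly concert counts follow a Poisson distribution with mean 2 (the example does not specify a model), and it estimates how often a random sample of 50 students would then produce a sample mean of 3.2 or larger.

```python
import math
import random

def poisson(mu, rng):
    """Draw one Poisson(mu) value (Knuth's method); the Poisson model is an illustrative assumption."""
    threshold = math.exp(-mu)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

rng = random.Random(414)
n_reps, n_students, hits = 10_000, 50, 0
for _ in range(n_reps):
    sample_mean = sum(poisson(2, rng) for _ in range(n_students)) / n_students
    if sample_mean >= 3.2:
        hits += 1

print(f"estimated P(sample mean >= 3.2 | population mean 2): {hits / n_reps:.4f}")
```

Under that assumed model the estimate is essentially zero, which is exactly the "not likely" situation described above.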

Example 2-2

Suppose that the Penn State Parking Office claims that two-thirds (67%) of Penn State students maintain a car in State College. Then, suppose we take a random sample of 100 Penn State students and determine that the proportion of students in the sample who maintain a car in State College is:

\(\dfrac{69}{100}=0.69\)

that is, 69%. Now we need to ask the question: if the actual population proportion is 0.67, how likely is it that we'd get a sample proportion of 0.69?

What do you think? Is it likely or not likely? If the answer to the question is ultimately "likely," then we have just one possible conclusion: The Parking Office's claim is reasonable. Do not reject their claim.

Again, I don't raise this example simply to draw conclusions about the driving behaviors of Penn State students. Instead I raise it to illustrate again that in order to use a random sample to draw a conclusion about a population, we need to be able to answer the question "how likely...?", that is "what is the probability...?".
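
The same kind of check works here. A minimal sketch, assuming the 100 students are sampled independently so that the number of car owners in the sample behaves like a binomial count with n = 100 and p = 0.67:

```python
import random

rng = random.Random(414)
n_reps, n, p, hits = 100_000, 100, 0.67, 0
for _ in range(n_reps):
    owners = sum(rng.random() < p for _ in range(n))   # one binomial(n, p) draw
    if owners / n >= 0.69:
        hits += 1

print(f"estimated P(sample proportion >= 0.69 | p = 0.67): {hits / n_reps:.3f}")
```

Under that assumption the estimate comes out around 0.37, so a sample proportion of 0.69 is entirely consistent with the claimed two-thirds, and "likely" is the natural answer.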

So, in summary, why do we need to learn about probability? Any time we want to answer a research question that involves using a sample to draw a conclusion about some larger population, we need to answer the question "how likely is it...?" or "what is the probability...?". To answer such a question, we need to understand probability, probability rules, and probability models. And that's exactly what we'll be working on learning throughout this course.

Now that we've got the motivation for this probability course behind us, let's delve right in and start filling up our probability toolbox!

2.2 - Events

Recall that, given a random experiment, the outcome space (or sample space) \(\mathbf{S}\) is the collection of all possible outcomes of the random experiment.

Example 2-3

Suppose we randomly select a student, and ask them "how many pairs of jeans do you own?". In this case our sample space \(\mathbf{S}\) is:

\(\mathbf{S} = \{0, 1, 2, 3, ...\}\)

We could theoretically put some realistic upper limit on that sample space, but who knows what it would be? So, let's leave it as accurate as possible. Now let's define some events.

If \(A\) is the event that a randomly selected student owns no jeans:

\(A\) = student owns none = \(\{0\}\)

If \(B\) is the event that a randomly selected student owns some jeans:

\(B\) = student owns some = \(\{1, 2, 3, ...\}\)

If \(C\) is the event that a randomly selected student owns no more than five pairs of jeans:

\(C\) = student owns no more than five pairs = \(\{0, 1, 2, 3, 4, 5\}\)

And, if \(D\) is the event that a randomly selected student owns an odd number of pairs of jeans:

\(D\) = student owns an odd number = \(\{1, 3, 5, ...\}\)

Since events and sample spaces are just sets, let's review the algebra of sets:

  • \(\emptyset\) is the " null set " (or " empty set ")
  • \(C\cup D\) = " union " = the elements in \(C\) or \(D\) or both
  • \(A\cap B\) = " intersection " = the elements in \(A\) and \(B\). If \(A\cap B=\emptyset\), then \(A\) and \(B\) are called " mutually exclusive events " (or " disjoint events ").
  • \(D^\prime=D^c\)= " complement" = the elements not in \(D\)
  • If \(E\cup F\cup G...=\mathbf{S}\), then \(E, F, G\), and so on are called " exhaustive events ."

Example 2-3 Continued

Let's revisit the previous "how many pairs of jeans do you own?" example, in which our sample space is \(\mathbf{S} = \{0, 1, 2, 3, ...\}\) and the events \(A\), \(B\), \(C\), and \(D\) are defined as above.

Now, let's define some composite events.

The union of events \(C\) and \(D\) is the event that a randomly selected student either owns no more than five pairs or owns an odd number. That is:

\(C\cup D=\{0, 1, 2, 3, 4, 5, 7, 9, ...\}\)

The intersection of events \(A\) and \(B\) is the event that a randomly selected student owns no pairs and owns some pairs of jeans. That is:

\(A\cap B = \{0\} \cap \{1, 2, 3, ...\}\) = the empty set \(\emptyset\)

The complement of event \(D\) is the event that a randomly selected student owns an even number of pairs of jeans. That is:

\(D^\prime= \{0, 2, 4, 6, ...\}\)

If \(E = \{0, 1\}\), \(F = \{2, 3\}\), \(G = \{4, 5\}\) and so on, so that:

\(E\cup F\cup G\cup ...=\mathbf{S}\)

then \(E, F, G\), and so on are exhaustive events.
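
Because events are just sets, all of the operations in this example can be mirrored with Python's built-in set type. In this sketch the infinite sample space is truncated at 19 pairs purely so the sets are finite; the event definitions are the ones used above.

```python
# Truncate the infinite sample space at 19 pairs purely for illustration.
S = set(range(20))

A = {0}                                   # owns none
B = {x for x in S if x >= 1}              # owns some
C = {x for x in S if x <= 5}              # owns no more than five pairs
D = {x for x in S if x % 2 == 1}          # owns an odd number

print("C union D:       ", sorted(C | D))   # 0 through 5, plus every odd number
print("A intersect B:   ", A & B)           # set(): A and B are mutually exclusive
print("complement of D: ", sorted(S - D))   # the even numbers

# E = {0, 1}, F = {2, 3}, G = {4, 5}, ... together cover S, so they are exhaustive.
blocks = [set(range(k, k + 2)) for k in range(0, 20, 2)]
print("exhaustive:", set().union(*blocks) == S)   # True
```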

2.3 - What is Probability (Informally)?

We'll get to the more formal definition of probability soon, but let's think about probability just informally for a moment. How about this as an informal definition? The probability of an event is a number between 0 and 1 that measures how likely that event is to occur: a probability near 0 means the event is very unlikely, while a probability near 1 means the event is very likely.

2.4 - How to Assign Probability to Events

We know that probability is a number between 0 and 1. How does an event get assigned a particular probability value? Well, there are three ways of doing so:

  • the personal opinion approach
  • the relative frequency approach
  • the classical approach

On this page, we'll take a look at each approach.

The Personal Opinion Approach

This approach is the simplest in practice, but it is therefore also the least reliable. You might think of it as the "whatever it is to you" approach. Here are some examples:

  • "I think there is an 80% chance of rain today."
  • "I think there is a 50% chance that the world's oil reserves will be depleted by the year 2100."
  • "I think there is a 1% chance that the men's basketball team will end up in the Final Four sometime this decade."

Example 2-4

At which end of the probability scale would you put the probability that:

  • one day you will die?
  • you can swim around the world in 30 hours?
  • you will win the lottery someday?
  • a randomly selected student will get an A in this course?
  • you will get an A in this course?

The Relative Frequency Approach

The relative frequency approach involves taking the following three steps in order to determine \(P(A)\), the probability of an event \(A\):

  • Perform an experiment a large number of times, \(n\), say.
  • Count the number of times the event \(A\) of interest occurs; call that number \(N(A)\).
  • Then, the probability of event \(A\) equals:

\(P(A)=\dfrac{N(A)}{n}\)

The relative frequency approach is useful when the classical approach that is described next can't be used.

Example 2-5

When you toss a fair coin with one side designated as a "head" and the other side designated as a "tail", what is the probability of getting a head?

I think you all might instinctively reply \(\dfrac{1}{2}\). Of course, right? Well, there are three people who once felt compelled to determine the probability of getting a head using the relative frequency approach:

  • Count Buffon tossed a coin 4,040 times and got 2,048 heads, a relative frequency of 0.5069.
  • Karl Pearson tossed a coin 24,000 times and got 12,012 heads, a relative frequency of 0.5005.
  • John Kerrich tossed a coin 10,000 times and got 5,067 heads, a relative frequency of 0.5067.

As you can see, the relative frequency approach yields a pretty good approximation to the 0.50 probability that we would all expect of a fair coin. Perhaps this example also illustrates the large number of times an experiment has to be conducted in order to get reliable results when using the relative frequency approach.
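
A computer makes the large-\(n\) requirement painless. This simulation sketch (our own tosses, not the historical ones) shows the relative frequency \(N(A)/n\) settling near 0.5 as \(n\) grows:

```python
import random

rng = random.Random(1)
for n in (100, 1_000, 10_000, 100_000):
    heads = sum(rng.random() < 0.5 for _ in range(n))   # N(A): number of heads in n tosses
    print(f"n = {n:>7,}   relative frequency of heads = {heads / n:.4f}")
```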

By the way, Count Buffon (1707-1788) was a French naturalist and mathematician who often pondered interesting probability problems. His most famous question

Suppose we have a floor made of parallel strips of wood, each the same width, and we drop a needle onto the floor. What is the probability that the needle will lie across a line between two strips?

came to be known as Buffon's needle problem. Karl Pearson (1857-1936) effectively established the field of mathematical statistics. And, once you hear John Kerrich's story, you might understand why he, of all people, carried out such a mind-numbing experiment. He was an English mathematician who was lecturing at the University of Copenhagen when World War II broke out. He was arrested by the Germans and spent the war interned in a prison camp in Denmark. To help pass the time he performed a number of probability experiments, such as this coin-tossing one.

Example 2-6

Some trees in a forest were showing signs of disease. A random sample of 200 trees of various sizes was examined, yielding the following results:

            Disease-free   Doubtful   Diseased   Total
  Large           35           18         15       68
  Medium          46           32         14       92
  Small           24            8          8       40
  Total          105           58         37      200

What is the probability that one tree selected at random is large?

There are 68 large trees out of 200 total trees, so the relative frequency approach would tell us that the probability that a tree selected at random is large is 68/200 = 0.34.

What is the probability that one tree selected at random is diseased?

There are 37 diseased trees out of 200 total trees, so the relative frequency approach would tell us that the probability that a tree selected at random is diseased is 37/200 = 0.185.

What is the probability that one tree selected at random is both small and diseased?

There are 8 small, diseased trees out of 200 total trees, so the relative frequency approach would tell us that the probability that a tree selected at random is small and diseased is 8/200 = 0.04.

What is the probability that one tree selected at random is either small or disease-free?

There are 121 trees (35 + 46 + 24 + 8 + 8) out of 200 total trees that are either small or disease-free, so the relative frequency approach would tell us that the probability that a tree selected at random is either small or disease-free is 121/200 = 0.605.

What is the probability that one tree selected at random from the population of medium trees is doubtful of disease?

There are 92 medium trees in the sample. Of those 92 medium trees, 32 have been identified as being doubtful of disease. Therefore, the relative frequency approach would tell us that the probability that a medium tree selected at random is doubtful of disease is 32/92 = 0.348.
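
Every answer above is a count divided by a count, so the arithmetic is easy to mirror in code. A minimal sketch using only the counts quoted in the answers:

```python
total = 200

# Counts taken directly from the answers above.
large = 68
diseased = 37
small_and_diseased = 8
small_or_disease_free = 35 + 46 + 24 + 8 + 8    # = 121
medium, medium_doubtful = 92, 32

print("P(large)                 =", large / total)                      # 0.34
print("P(diseased)              =", diseased / total)                   # 0.185
print("P(small and diseased)    =", small_and_diseased / total)         # 0.04
print("P(small or disease-free) =", small_or_disease_free / total)      # 0.605
# Conditional relative frequency: restrict attention to the 92 medium trees.
print("P(doubtful | medium)     =", round(medium_doubtful / medium, 3)) # 0.348
```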

The Classical Approach

The classical approach is the method that we will investigate quite extensively in the next lesson. As long as the outcomes in the sample space are equally likely (!!!), the probability of event \(A\) is:

\(P(A)=\dfrac{N(A)}{N(\mathbf{S})}\)

where \(N(A)\) is the number of elements in the event \(A\), and \(N(\mathbf{S})\) is the number of elements in the sample space \(\mathbf{S}\). Let's take a look at an example.

Example 2-7

Suppose you draw one card at random from a standard deck of 52 cards. Recall that a standard deck of cards contains 13 face values (Ace, 2, 3, 4, 5, 6, 7, 8, 9, 10, Jack, Queen, and King) in 4 different suits (Clubs, Diamonds, Hearts, and Spades) for a total of 52 cards. Assume the cards were manufactured to ensure that each outcome is equally likely with a probability of 1/52. Let \(A\) be the event that the card drawn is a 2, 3, or 7. Let \(B\) be the event that the card is a 2 of hearts (H), 3 of diamonds (D), 8 of spades (S) or king of clubs (C). That is:

  • \(A= \{x: x \text{ is a }2, 3,\text{ or }7\}\)
  • \(B = \{x: x\text{ is 2H, 3D, 8S, or KC}\}\)
  • What is the probability that a 2, 3, or 7 is drawn?
  • What is the probability that the card is a 2 of hearts, 3 of diamonds, 8 of spades or king of clubs?
  • What is the probability that the card is either a 2, 3, or 7 or a 2 of hearts, 3 of diamonds, 8 of spades or king of clubs?
  • What is \(P(A\cap B)\)?
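
Under the equally-likely assumption, each of these questions reduces to counting cards. A short sketch that builds the 52-card sample space and counts (the two-character card labels are our own encoding, used only for illustration):

```python
from fractions import Fraction
from itertools import product

values = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["C", "D", "H", "S"]
deck = [v + s for v, s in product(values, suits)]   # the sample space S: 52 equally likely outcomes

A = {card for card in deck if card[:-1] in {"2", "3", "7"}}   # a 2, 3, or 7
B = {"2H", "3D", "8S", "KC"}                                  # the four named cards

def P(event):
    """Classical probability: N(event) / N(S)."""
    return Fraction(len(event), len(deck))

print("P(A)       =", P(A))       # 12/52 = 3/13
print("P(B)       =", P(B))       # 4/52  = 1/13
print("P(A or B)  =", P(A | B))   # 14/52 = 7/26
print("P(A and B) =", P(A & B))   # 2/52  = 1/26
```

The two cards counted by both events, the 2 of hearts and the 3 of diamonds, explain why \(P(A\cup B)\) is 14/52 rather than 16/52.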

2.5 - What is Probability (Formally)?

Previously, we defined probability informally. Now, let's take a look at a formal definition using the “ axioms of probability .”

Probability is a (real-valued) set function \(P\) that assigns to each event \(A\) in the sample space \(\mathbf{S}\) a number \(P(A)\), called the probability of the event \(A\), such that the following hold:

  • The probability of any event \(A\) must be nonnegative, that is, \(P(A)\ge 0\).
  • The probability of the sample space is 1, that is, \(P(\mathbf{S})=1\).
  • If \(A_1, A_2, \ldots\) are mutually exclusive (disjoint) events, then the probability of a finite union of the events is the sum of the probabilities of the individual events, that is:

\(P(A_1\cup A_2 \cup \cdots \cup A_k)=P(A_1)+P(A_2)+\cdots+P(A_k)\)

and the probability of a countably infinite union of the events is the sum of the probabilities of the individual events, that is:

\(P(A_1\cup A_2 \cup \cdots )=P(A_1)+P(A_2)+\cdots \)

Example 2-8

Suppose that a Stat 414 class contains 43 students, such that 1 is a Freshman, 4 are Sophomores, 20 are Juniors, 9 are Seniors, and 9 are Graduate students.

Randomly select one student from the Stat 414 class, and define the following events:

  • Fr = the event that a Freshman is selected
  • So = the event that a Sophomore is selected
  • Ju = the event that a Junior is selected
  • Se = the event that a Senior is selected
  • Gr = the event that a Graduate student is selected

The sample space is \(\mathbf{S} = \{\text{Fr, So, Ju, Se, Gr}\}\). Using the relative frequency approach to assigning probability to the events:

  • P (Fr) = 0.02
  • P (So) = 0.09
  • P (Ju) = 0.47
  • P (Se) = 0.21
  • P (Gr) = 0.21

Let's check to make sure that each of the three axioms of probability is satisfied.
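
A short sketch of that check: the assigned numbers are all nonnegative, and because the five class-standing events are mutually exclusive and exhaustive, the third axiom requires their probabilities to add up to \(P(\mathbf{S})=1\).

```python
probs = {"Fr": 0.02, "So": 0.09, "Ju": 0.47, "Se": 0.21, "Gr": 0.21}

assert all(p >= 0 for p in probs.values())      # Axiom 1: every probability is nonnegative
assert abs(sum(probs.values()) - 1.0) < 1e-9    # Axioms 2 and 3: the disjoint, exhaustive events sum to P(S) = 1
print("sum of the five probabilities:", round(sum(probs.values()), 2))
```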

2.6 - Five Theorems

Now, let's use the axioms of probability to derive yet more helpful probability rules. We'll work through five theorems in all, in each case first stating the theorem and then proving it. Then, once we've added the five theorems to our probability toolbox, we'll close this lesson by applying the theorems to a few examples. The five theorems are:

  • \(P(A)=1-P(A^\prime)\)
  • \(P(\emptyset)=0\)
  • If \(A\subseteq B\), then \(P(A)\le P(B)\)
  • For any event \(A\), \(P(A)\le 1\)
  • \(P(A\cup B)=P(A)+P(B)-P(A\cap B)\)

2.7 - Some Examples

Example 2-9

A company has bid on two large construction projects. The company president believes that the probability of winning the first contract is 0.6, the probability of winning the second contract is 0.4, and the probability of winning both contracts is 0.2.

  • What is the probability that the company wins at least one contract?
  • What is the probability that the company wins the first contract but not the second contract?
  • What is the probability that the company wins neither contract?
  • What is the probability that the company wins exactly one contract?
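
These four questions are direct applications of the theorems above, inclusion-exclusion and the complement rule in particular. A minimal sketch of the arithmetic:

```python
p_a, p_b, p_ab = 0.6, 0.4, 0.2   # P(first), P(second), P(both)

p_at_least_one = p_a + p_b - p_ab       # inclusion-exclusion: 0.8
p_first_only   = p_a - p_ab             # wins the first but not the second: 0.4
p_neither      = 1 - p_at_least_one     # complement of "at least one": 0.2
p_exactly_one  = p_at_least_one - p_ab  # at least one, minus both: 0.6

for name, p in [("at least one", p_at_least_one), ("first only", p_first_only),
                ("neither", p_neither), ("exactly one", p_exactly_one)]:
    print(f"P({name}) = {p:.1f}")
```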

Example 2-10

If it is known that \(A\subseteq B\), what can be definitively said about \(P(A\cap B)\)?

Example 2-11

If 7% of the population smokes cigars, 28% of the population smokes cigarettes, and 5% of the population smokes both, what percentage of the population smokes neither cigars nor cigarettes?
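
This is the same inclusion-exclusion-plus-complement pattern as the previous example. A minimal sketch of the arithmetic:

```python
p_cigars, p_cigarettes, p_both = 0.07, 0.28, 0.05

p_either = p_cigars + p_cigarettes - p_both   # inclusion-exclusion: 0.30
p_neither = 1 - p_either                      # complement rule: 0.70
print(f"{p_neither:.0%} of the population smokes neither")   # 70%
```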

6.3: The Binomial Distribution

Learning Objectives

  • To learn the concept of a binomial random variable.
  • To learn how to recognize a random variable as being a binomial random variable.

The experiment of tossing a fair coin three times and the experiment of observing the genders according to birth order of the children in a randomly selected three-child family are completely different, but the random variables that count the number of heads in the coin toss and the number of boys in the family (assuming the two genders are equally likely) are the same random variable, the one with probability distribution

\[\begin{array}{c|cccc} x& 0& 1& 2& 3\\ \hline P(x)& 0.125& 0.375& 0.375& 0.125\\ \end{array} \nonumber \]

A histogram that graphically illustrates this probability distribution is given in Figure \(\PageIndex{1}\). What is common to the two experiments is that we perform three identical and independent trials of the same action, each trial has only two outcomes (heads or tails, boy or girl), and the probability of success is the same number, \(0.5\), on every trial. The random variable that is generated is called the binomial random variable with parameters \(n=3\) and \(p=0.5\). This is just one case of a general situation.

Figure \(\PageIndex{1}\): Histogram of the probability distribution of the number of heads in three tosses of a fair coin.

Definition: binomial distribution

Suppose a random experiment has the following characteristics.

  • There are \(n\) identical and independent trials of a common procedure.
  • There are exactly two possible outcomes for each trial, one termed “success” and the other “failure.”
  • The probability of success on any one trial is the same number \(p\).

Then the discrete random variable \(X\) that counts the number of successes in the n trials is the binomial random variable with parameters \(n\) and \(p\). We also say that \(X\) has a binomial distribution with parameters \(n\) and \(p\).

The following four examples illustrate the definition. Note how in every case “success” is the outcome that is counted, not the outcome that we prefer or think is better in some sense.

  • A random sample of \(125\) students is selected from a large college in which the proportion of students who are females is \(57\%\). Suppose \(X\) denotes the number of female students in the sample. In this situation there are \(n=125\) identical and independent trials of a common procedure, selecting a student at random; there are exactly two possible outcomes for each trial, “success” (what we are counting, that the student be female) and “failure;” and finally the probability of success on any one trial is the same number \(p = 0.57\). \(X\) is a binomial random variable with parameters \(n = 125\) and \(p = 0.57\).
  • A multiple-choice test has \(15\) questions, each of which has five choices. An unprepared student taking the test answers each of the questions completely randomly by choosing an arbitrary answer from the five provided. Suppose \(X\) denotes the number of answers that the student gets right. \(X\) is a binomial random variable with parameters \(n = 15\) and \(p=1/5=0.20\).
  • In a survey of \(1,000\) registered voters each voter is asked if he intends to vote for a candidate Titania Queen in the upcoming election. Suppose \(X\) denotes the number of voters in the survey who intend to vote for Titania Queen. \(X\) is a binomial random variable with \(n = 1000\) and \(p\) equal to the true proportion of voters (surveyed or not) who intend to vote for Titania Queen.
  • An experimental medication was given to \(30\) patients with a certain medical condition. Suppose \(X\) denotes the number of patients who develop severe side effects. \(X\) is a binomial random variable with \(n = 30\) and \(p\) equal to the true probability that a patient with the underlying condition will experience severe side effects if given that medication.

Probability Formula for a Binomial Random Variable

Often the most difficult aspect of working a problem that involves the binomial random variable is recognizing that the random variable in question has a binomial distribution. Once that is known, probabilities can be computed using the following formula.

If \(X\) is a binomial random variable with parameters \(n\) and \(p\), then

\[P(x)=\frac{n!}{x!(n-x)!}p^xq^{n-x} \nonumber \]

where \(q=1-p\) and where for any counting number \(m\), \(m!\) (read “m factorial”) is defined by

\[1!=1,\quad 2!=1\cdot 2,\quad 3!=1\cdot 2\cdot 3 \nonumber \]

and in general

\[m!=1\cdot 2\cdot \cdots \cdot (m-1)\cdot m \nonumber \]

Also, by definition, \(0!=1\).
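
The formula translates directly into a few lines of code. This sketch uses the standard library's math.comb for the binomial coefficient \(n!/(x!(n-x)!)\); the example call reproduces the \(n=5\), \(p=0.17\) distribution worked out in the example that follows.

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(x) = C(n, x) * p**x * q**(n - x) with q = 1 - p."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Example: n = 5, p = 0.17 (the financial-fraud example below).
for x in range(6):
    print(x, f"{binomial_pmf(x, 5, 0.17):.6f}")
```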

Example \(\PageIndex{1}\)

Seventeen percent of victims of financial fraud know the perpetrator of the fraud personally.

  • Use the formula to construct the probability distribution for the number \(X\) of people in a random sample of five victims of financial fraud who knew the perpetrator personally.
  • An investigator examines five cases of financial fraud every day. Find the most frequent number of cases each day in which the victim knew the perpetrator.
  • An investigator examines five cases of financial fraud every day. Find the average number of cases per day in which the victim knew the perpetrator.

The random variable X is binomial with parameters \(n = 5\) and \(p = 0.17\); \(q=1-p=0.83\). The possible values of \(X\) are \(0, 1, 2, 3, 4,\; \text{and}\; 5\).

\[\begin{align*} P(0) &= \frac{5!}{0!5!}(0.17)^0(0.83)^5\\ &= \frac{1\cdot 2\cdot 3\cdot 4\cdot 5}{(1)(1\cdot 2\cdot 3\cdot 4\cdot 5)}1\cdot (0.3939040643)\\ &= 0.3939040643\approx 0.3939 \end{align*} \nonumber \]

\[\begin{align*} P(1) &= \frac{5!}{1!4!}(0.17)^1(0.83)^4\\ &= \frac{1\cdot 2\cdot 3\cdot 4\cdot 5}{(1)(1\cdot 2\cdot 3\cdot 4)}(0.17)\cdot (0.47458321)\\ &= 5\cdot (0.17)\cdot (0.47458321)\\ &= 0.4033957285 \approx 0.4034 \end{align*} \nonumber \]

\[\begin{align*} P(2) &= \frac{5!}{2!3!}(0.17)^2(0.83)^3\\ &= \frac{1\cdot 2\cdot 3\cdot 4\cdot 5}{(1\cdot 2)(1\cdot 2\cdot 3)}(0.0289)\cdot (0.571787)\\ &= 10\cdot (0.0289)\cdot (0.571787)\\ &= 0.165246443 \approx 0.1652 \end{align*} \nonumber \]

\[\begin{array}{c|cccccc} x& 0& 1& 2& 3& 4& 5\\ \hline P(x)& 0.3939& 0.4034& 0.1652& 0.0338& 0.0035& 0.0001 \\ \end{array} \nonumber \]

The probabilities do not add up to exactly \(1\) because of rounding.

This probability distribution is represented by the histogram in Figure \(\PageIndex{2}\), which graphically illustrates just how improbable the events \(X = 4\) and \(X = 5\) are. The corresponding bar in the histogram above the number \(4\) is barely visible, if visible at all, and the bar above \(5\) is far too short to be visible.

Figure \(\PageIndex{2}\): Histogram of the probability distribution of \(X\).

The value of \(X\) that is most likely is \(X = 1\), so the most frequent number of cases seen each day in which the victim knew the perpetrator is one.

The average number of cases per day in which the victim knew the perpetrator is the mean of \(X\), which is

\[\begin{align} μ&=\sum xP(x) \\ &=0⋅0.3939+1⋅0.4034+2⋅0.1652+3⋅0.0338+4⋅0.0035+5⋅0.0001 \\ &= 0.8497 \end{align} \nonumber \]

Special Formulas for the Mean and Standard Deviation of a Binomial Random Variable

Since a binomial random variable is a discrete random variable, the formulas for its mean, variance, and standard deviation given in the previous section apply to it, as we just saw in Example \(\PageIndex{1}\) in the case of the mean. However, for the binomial random variable there are much simpler formulas.

\[\mu=np \nonumber \]

\[\sigma ^2=npq \nonumber \]

\[\sigma =\sqrt{npq} \nonumber \]

where \( q=1-p\).

Example \(\PageIndex{2}\)

Find the mean and standard deviation of the random variable \(X\) of Example \(\PageIndex{1}\).

The random variable \(X\) is binomial with parameters \(n = 5\) and \(p = 0.17\), and \(q=1-p=0.83\). Thus its mean and standard deviation are

\[\mu =np=(5)\cdot (0.17)=0.85 \; \; \text{(exactly)} \nonumber \]

\[\sigma =\sqrt{npq}=\sqrt{(5)\cdot (0.17)\cdot (0.83)}=\sqrt{0.7055}\approx 0.8399 \nonumber \]
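
A quick sketch that checks the shortcut formulas against the definition of the mean, \(\sum xP(x)\), for this same \(n=5\), \(p=0.17\) random variable:

```python
from math import comb, sqrt

n, p = 5, 0.17
q = 1 - p

mu, sigma = n * p, sqrt(n * p * q)   # shortcut formulas: np and sqrt(npq)
mu_from_definition = sum(x * comb(n, x) * p**x * q**(n - x) for x in range(n + 1))

print(f"mu = {mu:.4f}   sigma = {sigma:.4f}   mean from the definition = {mu_from_definition:.4f}")
```

The 0.8497 obtained from the rounded table in Example \(\PageIndex{1}\) differs from the exact 0.85 only because the tabled probabilities were rounded.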

The Cumulative Probability Distribution of a Binomial Random Variable

In order to allow a broader range of more realistic problems, probability tables for binomial random variables are provided for various choices of the parameters \(n\) and \(p\). These tables are not the probability distributions that we have seen so far, but are cumulative probability distributions. In the place of the probability \(P(x)\) the table contains the probability

\[P(X≤x) = P(0) + P(1) + \ldots + P(x) \nonumber \]

This is illustrated in Figure \(\PageIndex{3}\). The probability entered in the table corresponds to the area of the shaded region. The reason for providing a cumulative table is that in practical problems that involve a binomial random variable, the probability that is sought is typically of the form \( P(X≤x)\) or \( P(X≥x)\). The cumulative table is much easier to use for computing \( P(X≤x)\) since all the individual probabilities have already been computed and added. The one table suffices for both \( P(X≤x)\) and \( P(X≥x)\), and can be used to readily obtain probabilities of the form \( P(x)\), too, because of the following formulas. The first is just the Probability Rule for Complements.

Figure \(\PageIndex{3}\): The cumulative probability \(P(X≤x)\) shown as the area of a shaded region.

If \(X\) is a discrete random variable, then

\[P(X≥x)=1−P(X≤x−1) \nonumber \]

 and

\[P(x)=P(X≤x)−P(X≤x−1) \nonumber \]

Example \(\PageIndex{3}\)

A student takes a ten-question true/false exam.

  • Find the probability that the student gets exactly six of the questions right simply by guessing the answer on every question.
  • Find the probability that the student will obtain a passing grade of \(60\%\) or greater simply by guessing.

Let \(X\) denote the number of questions that the student guesses correctly. Then \(X\) is a binomial random variable with parameters \(n = 10\) and \(p= 0.50\).

  • The probability sought is \(P(6)\). The formula gives \[P(6)=\frac{10!}{(6!)(4!)}(0.5)^6(0.5)^4=0.205078125\nonumber \] Using the table, \[P(6)=P(X≤6)−P(X≤5)=0.8281−0.6230=0.2051\nonumber \]
  • The student must guess correctly on at least \(60\%\) of the questions, which is \((0.60)\cdot (10)=6\) questions. The probability sought is not \(P(6)\) (an easy mistake to make), but \[P(X≥6)=P(6)+P(7)+P(8)+P(9)+P(10)\nonumber \] Instead of computing each of these five numbers using the formula and adding them we can use the table to obtain \[P(X≥6)=1−P(X≤5)=1−0.6230=0.3770\nonumber \] which is much less work and of sufficient accuracy for the situation at hand.
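
Both parts can be checked with a few lines of code; the cumulative probability \(P(X≤5)\) is just a running sum of the pmf, standing in for the printed table:

```python
from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.5

p6 = binomial_pmf(6, n, p)                                   # P(6)
p_at_most_5 = sum(binomial_pmf(x, n, p) for x in range(6))   # P(X <= 5), the table entry
p_at_least_6 = 1 - p_at_most_5                               # complement rule

print(f"P(6)      = {p6:.4f}")            # 0.2051
print(f"P(X >= 6) = {p_at_least_6:.4f}")  # 0.3770
```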

Example \(\PageIndex{4}\)

An appliance repairman services five washing machines on site each day. One-third of the service calls require installation of a particular part.

  • The repairman has only one such part on his truck today. Find the probability that the one part will be enough today, that is, that at most one washing machine he services will require installation of this particular part.
  • Find the minimum number of such parts he should take with him each day in order that the probability that he have enough for the day's service calls is at least \(95\%\).

Let \(X\) denote the number of service calls today on which the part is required. Then \(X\) is a binomial random variable with parameters \(n = 5\) and \(p=1/3=0.\bar{3}\)

  • Note that the probability in question is not \(P(1)\), but rather \(P(X\leq 1)\). Using the cumulative distribution table, \[P(X≤1)=0.4609\nonumber \]
  • The answer is the smallest number \(x\) such that the table entry \(P(X\leq x)\) is at least \(0.9500\). Since \(P(X\leq2 )=0.7901\) is less than \(0.95\), two parts are not enough. Since \(P(X\leq 3)=0.9547 \) is as large as \(0.95\), three parts will suffice at least \(95\%\) of the time. Thus the minimum needed is three.
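
The table lookup in the second part amounts to scanning cumulative probabilities until one reaches 0.95. A sketch that computes the cumulative probabilities directly instead of reading them from a table:

```python
from math import comb

def binomial_cdf(x, n, p):
    """P(X <= x) for a binomial(n, p) random variable."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

n, p = 5, 1 / 3

print(f"P(X <= 1) = {binomial_cdf(1, n, p):.4f}")   # 0.4609: one part is often not enough

# Smallest x with P(X <= x) >= 0.95.
needed = next(x for x in range(n + 1) if binomial_cdf(x, n, p) >= 0.95)
print("minimum number of parts:", needed)           # 3
```
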
Key Takeaways

  • The discrete random variable \(X\) that counts the number of successes in \(n\) identical, independent trials of a procedure that always results in either of two outcomes, “success” or “failure,” and in which the probability of success on each trial is the same number \(p\), is called the binomial random variable with parameters \(n\) and \(p\).
  • There is a formula for the probability that the binomial random variable with parameters \(n\) and \(p\) will take a particular value \(x\).
  • There are special formulas for the mean, variance, and standard deviation of the binomial random variable with parameters \(n\) and \(p\) that are much simpler than the general formulas that apply to all discrete random variables.
  • Cumulative probability distribution tables, when available, facilitate computation of probabilities encountered in typical practical situations.
