Chapter 6 AP STATS

Binomial coefficient
The number of ways of arranging k successes among n observations is given by the binomial coefficient C(n, k) = n!/(k!(n − k)!) for k = 0, 1, 2, ..., n.
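A quick check of the formula using Python's standard-library math.comb (n = 5 and k = 3 are just for illustration):

import math

# Number of ways to arrange k = 3 successes among n = 5 observations
print(math.comb(5, 3))                                                # 10
print(math.factorial(5) // (math.factorial(3) * math.factorial(2)))   # same value from n!/(k!(n - k)!)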
Binomial distribution
In a binomial setting, suppose we let X = the number of successes. The probability distribution of X is a binomial distribution with parameters n and p, where n is the number of trials of the chance process and p is the probability of a success on any one trial. The possible values of X are the whole numbers from 0 to n.
Binomial probability
If X has the binomial distribution with n trials and probability p of success on each trial, the possible values of X are 0, 1, 2, ..., n. If k is any one of these values, then P(X = k) = C(n, k) ∙ p^k ∙ (1 − p)^(n − k).
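A minimal sketch with made-up numbers (n = 10 trials, p = 0.3, k = 4 successes), evaluating the formula directly:

from math import comb

n, p, k = 10, 0.3, 4
prob = comb(n, k) * p**k * (1 - p)**(n - k)   # binomial probability P(X = 4)
print(prob)                                   # about 0.2001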
Binomial random variable
The count X of successes in a binomial setting.
Binomial setting
Arises when we perform several independent trials of the same chance process and record the number of times that a particular outcome occurs. The four conditions for a binomial setting are:
• Binary? The possible outcomes of each trial can be classified as "success" or "failure."
• Independent? Trials must be independent; that is, knowing the result of one trial must not have any effect on the result of any other trial.
• Number? The number of trials n of the chance process must be fixed in advance.
• Success? On each trial, the probability p of success must be the same.
Continuous random variable
Takes all values in an interval of numbers. The probability distribution of a continuous random variable is described by a density curve. The probability of any event is the area under the density curve and above the values of the variable that make up the event.
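For example, if X has a standard Normal density curve, the probability of an event is the corresponding area under the curve; a sketch assuming SciPy is available:

from scipy.stats import norm

# P(-1 < X < 1) = area under the standard Normal density curve between -1 and 1
area = norm.cdf(1) - norm.cdf(-1)
print(area)   # about 0.6827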
Discrete random variable
Takes a fixed set of possible values with gaps between. The probability distribution of a discrete random variable gives its possible values and their probabilities. The probability of any event is the sum of the probabilities for the values of the variable that make up the event.
Factorial
For any positive whole number n, its factorial n! is
n! = n ∙ (n − 1) ∙ (n − 2) ∙ ... ∙ 3 ∙ 2 ∙ 1
In addition, we define 0! = 1.
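In Python, the same values come from the standard-library math.factorial:

import math

print(math.factorial(5))   # 5! = 5 * 4 * 3 * 2 * 1 = 120
print(math.factorial(0))   # 0! is defined to be 1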
Geometric distribution
In a geometric setting, suppose we let Y = the number of trials required to get the first success. The probability distribution of Y is a geometric distribution with parameter p, the probability of a success on any trial. The possible values of Y are 1, 2, 3, ....
Geometric random variable
The number of trials Y that it takes to get a success in a geometric setting.
Geometric setting
A geometric setting arises when we perform independent trials of the same chance process and record the number of trials until a particular outcome occurs. The four conditions for a geometric setting are:
• Binary? The possible outcomes of each trial can be classified as "success" or "failure."
• Independent? Trials must be independent; that is, knowing the result of one trial must not have any effect on the result of any other trial.
• Trials? The goal is to count the number of trials until the first success occurs.
• Success? On each trial, the probability p of success must be the same.
Geometric probability
If Y has the geometric distribution with probability p of success on each trial, the possible values of Y are 1, 2, 3, .... If k is any one of these values, then P(Y = k) = (1 − p)^(k − 1) ∙ p.
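A minimal sketch with made-up numbers (p = 0.2, first success on trial k = 3):

p, k = 0.2, 3
prob = (1 - p)**(k - 1) * p   # geometric probability P(Y = 3)
print(prob)                   # 0.8**2 * 0.2 = 0.128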
Independent random variables
If knowing whether any event involving X alone has occurred tells us nothing about the occurrence of any event involving Y alone, and vice versa, then X and Y are independent random variables. That is, there is no association between the values of one variable and the values of the other.
Linear transformation
A linear transformation of a random variable involves adding a constant a, multiplying by a constant b, or both. We can write a linear transformation of the random variable X in the form Y = a + bX.
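A simulation sketch with made-up numbers, assuming NumPy is available; it illustrates that adding a shifts the center while multiplying by b rescales both the center and the spread:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10, scale=2, size=100_000)   # simulated X with mean 10, SD 2 (made up)
a, b = 5, 3
y = a + b * x                                   # linear transformation Y = a + bX
print(x.mean(), y.mean())                       # mean of Y is about a + b*10 = 35
print(x.std(), y.std())                         # SD of Y is about |b|*2 = 6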
Mean (expected value) of a random variable
The mean of a random variable X, denoted by μX, is the balance point of the probability distribution histogram or density curve. Since the mean is the long-run average value of the variable after many repetitions of the chance process, it is also known as the expected value E(X) of the random variable.
Mean (expected value) of a discrete random variable X
To find the mean (expected value) of X, multiply each possible value by its probability, then add all the products: μX = E(X) = x₁p₁ + x₂p₂ + x₃p₃ + ⋯ = Σxᵢpᵢ.
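A sketch with a hypothetical distribution (the values and probabilities are made up):

values = [0, 1, 2, 3]
probs  = [0.1, 0.2, 0.4, 0.3]
mean = sum(x * p for x, p in zip(values, probs))
print(mean)   # 0*0.1 + 1*0.2 + 2*0.4 + 3*0.3 = 1.9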
Mean of the sum (difference) of random variables
For any two random variables X and Y, if T = X + Y, then the mean of T is μT = μX + μY. If D = X − Y, then the mean of D is μD = μX − μY. In general, the mean of the sum (difference) of several random variables is the sum (difference) of their means.
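A quick simulation check with made-up means (4 and 7), assuming NumPy is available:

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(4, 1, 100_000)    # X with mean 4 (made up)
y = rng.normal(7, 2, 100_000)    # Y with mean 7 (made up)
print((x + y).mean())            # about 4 + 7 = 11
print((x - y).mean())            # about 4 - 7 = -3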
Mean (expected value) of a geometric random variable
If Y is a geometric random variable with probability of success p on each trial, then its mean (expected value) is μY = E(Y) = 1/p. That is, the expected number of trials required to get the first success is 1/p.
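A simulation sketch (p = 0.25 is made up), assuming NumPy is available; NumPy's geometric generator counts trials up to and including the first success:

import numpy as np

rng = np.random.default_rng(2)
p = 0.25
trials = rng.geometric(p, size=100_000)   # number of trials until the first success
print(trials.mean())                      # about 1/p = 4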
Mean and standard deviation of a binomial random variable
If a count X has the binomial distribution with number of trials n and probability of success p, the mean and standard deviation of X are
μX = np and σX = √(np(1 − p))
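A sketch with hypothetical parameters (n = 20, p = 0.4), checked against a simulation, assuming NumPy is available:

import numpy as np

n, p = 20, 0.4
print(n * p)                      # mean: 8
print((n * p * (1 - p)) ** 0.5)   # standard deviation: about 2.19
x = np.random.default_rng(3).binomial(n, p, 100_000)
print(x.mean(), x.std())          # simulation agrees closely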
Normal approximation for binomial distributions
If X is a count having the binomial distribution with parameters n and p, then when n is large, X is approximately Normally distributed with mean np and standard deviation √(np(1 − p)). We will use this approximation when np ≥ 10 and n(1 − p) ≥ 10.
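A sketch comparing an exact binomial probability with its Normal approximation, assuming SciPy is available (n = 400 and p = 0.1 are made up; here np = 40 and n(1 − p) = 360, so both conditions hold):

from scipy.stats import binom, norm

n, p = 400, 0.1
mu, sigma = n * p, (n * p * (1 - p)) ** 0.5
print(binom.cdf(45, n, p))       # exact P(X <= 45)
print(norm.cdf(45, mu, sigma))   # Normal approximation, close to the exact value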
Probability distribution
The probability distribution of a random variable gives its possible values and their probabilities.
Random Variable
Takes numerical values that describe the outcomes of some chance process.
Standard deviation of a random variable
The square root of the variance of a random variable: σX = √(σX²). The standard deviation measures the variability of the distribution about the mean.
Variance of a random variable
The average squared deviation of the values of the variable from their mean.
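A sketch reusing the hypothetical distribution from the mean example above; the standard deviation is the square root of this variance:

values = [0, 1, 2, 3]
probs  = [0.1, 0.2, 0.4, 0.3]
mean = sum(x * p for x, p in zip(values, probs))
var  = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
print(var)            # variance: 0.89
print(var ** 0.5)     # standard deviation: about 0.943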
Variance of the sum (difference) of independent random variables
For any two independent random variables X and Y, if T = X + Y, then the variance of T is σT² = σX² + σY². If D = X − Y, then the variance of D is σD² = σX² + σY². In general, the variance of the sum (difference) of independent random variables is the sum of their variances.
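A simulation check with made-up standard deviations (3 and 4), assuming NumPy is available; note that the variances add even for a difference:

import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0, 3, 100_000)    # independent X with variance 9 (made up)
y = rng.normal(0, 4, 100_000)    # independent Y with variance 16 (made up)
print((x + y).var())             # about 9 + 16 = 25
print((x - y).var())             # also about 25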