31 terms

MAT 223 Exam #2 Vocabulary Review

probability experiment
any process with a result determined by chance
sample space
the set of all possible outcomes of an experiment
event
a subset of a sample space
tree diagram
a branching diagram used to organize and count the outcomes of a multi-stage experiment
subjective probability (personal)
an educated guess regarding an outcome of an experiment, particularly one that can't be (easily) repeated
experimental probability (empirical)
the ratio of the number of trials in which the event occurred to the total number of experimental trials
classical probability (theoretical)
the proportion of outcomes in an event to the total number of outcomes in the sample space (valid only if all the outcomes in the sample space are equally likely); a probability determined from knowledge of the situation rather than from experiment
The Law of Large Numbers
after a large number of experiments, the experimental probability will approximate the classical probability... and will get closer as the number of experiments increases
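A quick sketch of the Law of Large Numbers: simulating flips of a fair coin (classical probability of heads = 0.5) and watching the experimental probability settle toward that value as the number of trials grows. The seed and trial counts are arbitrary choices for illustration.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Empirical probability of "heads" for increasing numbers of trials;
# it should drift toward the classical value 0.5.
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)
```

With a million trials the empirical proportion typically lands within a fraction of a percent of 0.5, while the 100-trial estimate can be noticeably off.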
complement of an event
if E is an event, then its complement E' has probability 1 - P(E)
1
the sum that all the probabilities in a sample space must add to
0
the probability of an impossible event
1
the probability of a certain event
mutually exclusive
the probability of the intersection of two events is 0//two sets that have no overlap
independent events
the outcome of one experiment does not affect the outcome of the other
dependent events
the outcome of one experiment affects the outcome of the other
conditional probability
the probability of an event occurring given that another event has already occurred
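A small worked example of conditional probability, P(A | B) = P(A and B) / P(B), using two dice rolls; the specific events chosen here are just for illustration.

```python
# Conditional probability with two dice:
#   A = "the sum is 8", B = "the first die shows 3".
space = [(i, j) for i in range(1, 7) for j in range(1, 7)]  # 36 equally likely outcomes
B = [o for o in space if o[0] == 3]          # 6 outcomes: (3,1) ... (3,6)
A_and_B = [o for o in B if sum(o) == 8]      # only (3,5)
p_given = len(A_and_B) / len(B)
print(p_given)  # 1/6
```

Note that P(A) alone is 5/36, so learning that B occurred changes the probability of A, which is exactly what makes the events dependent.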
Fundamental Counting Principle//Multiplication Rule
a rule for calculating elements in a set (sample space or event) for sample spaces illustrated by tree diagrams, or where repeated use of the same outcome in successive experiments is allowed
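The Fundamental Counting Principle in miniature: if successive choices have a, b, c, ... options, the total number of outcomes is the product a * b * c * .... The menu counts below are hypothetical.

```python
# A meal with 4 appetizers, 5 entrees, and 3 desserts:
# each full meal is one branch of the tree diagram,
# so the total is the product of the options at each stage.
appetizers, entrees, desserts = 4, 5, 3
total = appetizers * entrees * desserts
print(total)  # 60
```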
factorial (n!)
multiply all the whole numbers from n down to 1
combination
a counting rule for when repetition is not allowed and order doesn't matter
permutation
a counting rule for when repetition is not allowed, but order does matter
special permutations
a counting rule used when order matters and no repetition is allowed, but some elements can't be distinguished from each other
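The counting rules above are all available in Python's standard `math` module, which makes a handy check for homework answers; the MISSISSIPPI example is the classic illustration of distinguishable ("special") permutations.

```python
import math

print(math.factorial(5))   # 5! = 120
print(math.perm(5, 2))     # ordered selections of 2 from 5 -> 20
print(math.comb(5, 2))     # unordered selections of 2 from 5 -> 10

# Special permutations: arrangements of MISSISSIPPI's 11 letters,
# dividing out the repeats (4 I's, 4 S's, 2 P's).
arrangements = math.factorial(11) // (
    math.factorial(4) * math.factorial(4) * math.factorial(2)
)
print(arrangements)  # 34650
```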
random variable
a variable whose outcome is determined by chance
discrete random variable
a random variable whose outcome can take on only some particular (usually numerical) values
probability distribution
a table or formula that gives the probabilities for every value of a random variable
expected value
the mean of the probability distribution; the long-run average outcome over many repeated trials of the experiment
variance of a discrete random variable
a measure of the spread of a discrete probability distribution
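Expected value and variance computed directly from a probability distribution table; the loaded-die distribution below is a made-up example.

```python
# Hypothetical distribution for X = face shown on a loaded die
# (probabilities must sum to 1).
dist = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}

mu = sum(x * p for x, p in dist.items())               # E[X] = sum of x * P(x)
var = sum((x - mu) ** 2 * p for x, p in dist.items())  # Var(X) = E[(X - mu)^2]
print(mu, var)
```

Here the mean is pulled toward 6 (the weighted face), and the variance measures how far outcomes spread around that mean.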
binomial distribution
a distribution of a discrete random variable whose trials are fixed in number, independent, and have only two possible outcomes
success
the outcome of interest in a binomial distribution
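The binomial formula P(X = k) = C(n, k) p^k (1-p)^(n-k) can be written in a few lines; the coin-flip numbers below are just an example.

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k successes in n independent trials, success prob p each)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(exactly 3 heads in 5 fair-coin flips) = C(5,3) / 2^5 = 10/32
print(binom_pmf(5, 3, 0.5))  # 0.3125
```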
normal distribution
a symmetric, bell-shaped continuous probability distribution completely determined by its mean and standard deviation
density curve
the curve that defines the probability of a continuous distribution (area under the curve is equivalent to the probability)
continuity correction
the adjustment (by 0.5) to the boundaries of a binomial range when approximating it with the normal distribution
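A sketch of the normal approximation with continuity correction, using only the standard library (the normal CDF comes from `math.erf`); the specific n and p are chosen for illustration.

```python
from math import erf, sqrt, comb

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    # Normal CDF expressed via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Approximate P(X <= 55) for X ~ Binomial(n=100, p=0.5).
n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))           # mean 50, sd 5
approx = normal_cdf(55.5, mu, sigma)               # continuity correction: 55 -> 55.5
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(56))
print(round(approx, 4), round(exact, 4))
```

Without the 0.5 correction the approximation would use P(Z <= (55 - 50)/5) instead and land noticeably farther from the exact binomial sum.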