Behavior analysis chapter 6
Terms in this set (62)
Operant behaviors are influenced by their _______.
Consequences.
Elicited behavior is a function of what _______ it; operant behavior is a function of what _______ it.
precedes; follows.
Another name for operant conditioning is _________.
Instrumental.
Thorndike's cats learned to solve the puzzle box problem ________.
Gradually.
Based on his research with cats, Thorndike formulated his famous ______ of ______, which states that behaviors that lead to a ________ state of affairs are strengthened, while behaviors that lead to an ________ state of affairs are weakened.
law of effect; satisfying; unsatisfying (annoying).
According to Thorndike, behaviors that worked were ____ ______ while behaviors that did not work were ______ ______.
stamped in; stamped out.
The Skinner box evolved out of Skinner's quest for a procedure that would, among other things, yield _____ patterns of behavior.
Regular.
In the original version of the Skinner box, rats earn food by _____ a _____; in a later version, pigeons earn a few seconds of access to food by ______ at an illuminated plastic disc known as _____ ______.
Pressing; lever; pecking; response key.
Skinner's procedures are also known as _____ _____ procedures in that the animal controls the rate at which it earns food.
Free operant.
Skinner originally thought all behavior could be explained in terms of _____, but he eventually decided that this type of behavior could be distinguished from another, seemingly more voluntary type of behavior known as _____ behavior.
Reflexes; operant.
Skinner's definition of operant conditioning differs from Thorndike's law of effect in that it views consequences in terms of their effect upon the strength of behavior rather than whether they are _____ or ______.
Satisfying; annoying.
Operant conditioning is similar to the principle of natural selection in that an individual's behaviors that are _____ tend to increase in frequency, while behaviors that are _______ tend to decrease in frequency.
Adaptive; nonadaptive.
The process of operant conditioning involves the following three components: a _____ that produces a certain _____, a _______ that serves to either increase or decrease the likelihood of the _____ that preceded it, and a _____ stimulus that precedes the _____ and signals that a certain _____ is now available.
Response; consequence; consequence; response; discriminative; response; consequence.
Classically conditioned behaviors are said to be ______ by the stimulus, while operant behaviors are said to be ______ by the organism.
Elicited; emitted.
Operant responses are also simply called _____.
Operants.
Operant behavior is usually defined as a ______ of responses rather than a specific response.
Class.
Simply put, reinforcers are those consequences that ______ a behavior, while punishers are those consequences that _____ a behavior.
Strengthen; weaken.
More specifically, a reinforcer is a consequence that _____ a behavior and _____ the probability of that behavior. A punisher is a consequence that ____ a behavior and _____ the probability of that behavior.
Follows; increases; follows; decreases.
The terms reinforcement and punishment refer to the _____ or _____ whereby a behavior is strengthened or weakened by its consequences.
Process; procedure (in either order).
Strengthening a roommate's tendency toward cleanliness by thanking her when she cleans the bathroom is an example of ___, while the thanks itself is a ______.
Reinforcement; reinforcer.
Eliminating a dog's tendency to jump up on visitors by scolding her when she does so is an example of ______, while the scolding itself is a _______.
Punishment; punisher.
Reinforcers and punishers are defined entirely by their _____ on behavior. For this reason, the term reinforcer is often preferred to the term _____ because the latter is too closely associated with events that are commonly regarded as pleasant or desirable.
Effect; reward.
When Moe stuck his finger in a light socket, he received an electric shock. As a result, he now sticks his finger in the light socket as often as possible. By definition, the electric shock was a ____ because the behavior it followed has _____ in frequency.
Reinforcer; increased.
Each time Edna talked out in class, her teacher immediately came over and gave her a hug. As a result, Edna no longer talks out in class. By definition, the hug is a ______ because the behavior it follows has _____ in frequency.
Punisher; decreased.
When labeling an operant conditioning procedure, punishing consequences (punishers) are given the symbol _____ (which stands for ______ ______), while reinforcing consequences (reinforcers) are given the symbol ______ (which stands for _____ ______). The operant response is given the symbol ______.
SP; punishing stimulus; SR; reinforcing stimulus; R.
When we give a dog a treat for fetching a toy, are we attempting to reinforce: (a) the behavior of fetching the toy or (b) the dog that fetched the toy?
The behavior of fetching the toy.
When we chastise a child for being rude, are we attempting to punish: (a) the child who was rude or (b) the child's rude behavior?
The child's rude behavior.
Weakening a behavior through the withdrawal of reinforcement for that behavior is known as ______.
Extinction.
Clayton stopped plugging in the toaster after he received an electric shock while doing so. This is an example of ______. Manzar stopped using the toaster after it no longer made good toast. This is an example of _____.
Punishment; extinction.
The operant conditioning procedure usually consists of three components: (1) a ____ ____, (2) an _____ response, and (3) a _____.
Discriminative stimulus; operant; consequence.
A discriminative stimulus is usually indicated by the symbol _____.
SD
A discriminative stimulus is said to "_____ for the behavior," meaning that its presence makes the response _____ likely to occur.
Set the occasion; more.
A discriminative stimulus _____ elicit behavior in the same manner as a CS.
Does not.
Using the appropriate symbols, label each component in the following three-term contingency.
Phone rings: Answer phone -> Conversation with friend.
SD; R; SR.
The three-term contingency can also be thought of as an ABC sequence, where A stands for _____ event, B stands for ____, and C stands for _____.
Antecedent; behavior; consequence.
Another way of thinking about the three-term contingency is that you _____ something, _____ something, and _______ something.
Notice; do; get.
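The three-term contingency can be pictured as a simple three-field record. Below is a minimal sketch in Python using the phone example from the card above; the class and field names are my own illustration, not terminology from the text.

```python
from dataclasses import dataclass

@dataclass
class ThreeTermContingency:
    antecedent: str   # discriminative stimulus (SD): what you notice
    behavior: str     # operant response (R): what you do
    consequence: str  # reinforcer or punisher (SR/SP): what you get

# The phone example from the cards above:
phone = ThreeTermContingency(
    antecedent="Phone rings",                # SD
    behavior="Answer phone",                 # R
    consequence="Conversation with friend",  # SR
)
```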
A stimulus in the presence of which a response is punished is called a ____ ____ for ____. It can be given the symbol _____.
Discriminative stimulus; punishment; SP.
A bell that signals the start of a round and therefore serves as an SD for the operant response of beginning to box may also serve as a _____ for a fear response. This is an example of how the two processes of _____ conditioning and _____ conditioning often overlap.
CS; classical; operant.
The word positive, when combined with the words reinforcement or punishment, means only that the behavior is followed by the ____ of something. The word negative, when combined with the words reinforcement or punishment, means only that the behavior is followed by the ____ of something.
Presentation; withdrawal.
The word positive, when combined with the words reinforcement or punishment, _____ mean that the consequence is good or pleasant. Similarly, the term negative, when combined with the words reinforcement or punishment _____ mean that the consequence is bad or unpleasant.
Does not; does not.
Within the context of reinforcement and punishment, positive refers to the ______ of something, and negative refers to the ______ of something.
Addition; subtraction.
Reinforcement is related to an ______ in behavior, whereas punishment is related to a ______ in behavior.
Increase; decrease.
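The next several cards drill the classification that follows from these two definitions: whether a stimulus is presented or removed (positive vs. negative), crossed with whether the behavior increases or decreases (reinforcement vs. punishment). Here is a minimal Python sketch of that decision table; the function name and input strings are my own illustration, not terminology from the text.

```python
def classify_consequence(stimulus_change: str, behavior_change: str) -> str:
    """Name the operant process from two observations.

    stimulus_change: "presented" or "removed"   -> positive vs. negative
    behavior_change: "increased" or "decreased" -> reinforcement vs. punishment
    """
    valence = {"presented": "positive", "removed": "negative"}[stimulus_change]
    process = {"increased": "reinforcement", "decreased": "punishment"}[behavior_change]
    return f"{valence} {process}"

# The dog-nipping card below: the aversive stimulus (the nip at your hand)
# is removed when you pull back, and reaching... er, nipping increases,
# so from the dog's side the process is negative reinforcement.
print(classify_consequence("removed", "increased"))    # negative reinforcement
print(classify_consequence("presented", "decreased"))  # positive punishment
```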
When you reached toward the dog, he nipped at your hand. You quickly pulled your hand back. As a result, he now nips at your hand whenever you reach toward him. The consequence for the dog's behavior of nipping consisted of the ______ of a stimulus, and his behavior of nipping subsequently _______ in frequency; therefore, this is an example of _____ reinforcement.
Removal; increased; negative.
When the dog sat at your feet and whined during breakfast one morning, you fed him. As a result, he sat at your feet and whined during breakfast the next morning. The consequence for the dog's whining consisted of the ___ of a stimulus, and his behavior of whining subsequently _______ in frequency; therefore, this is an example of _______ reinforcement.
Presentation; increased; positive.
Karen cries while saying to her boyfriend, "John, I don't feel as though you love me." John gives Karen a big hug, saying, "That's not true, dear, I love you very much." If John's hug is a reinforcer, Karen is _______ likely to cry the next time she feels insecure about her relationship. More specifically, this is an example of ________ reinforcement of Karen's crying behavior.
More; positive.
With respect to escape and avoidance, an _____ response is one that terminates an aversive stimulus, while an _____ response is one that prevents an aversive stimulus from occurring. Escape and avoidance responses are two classes of behavior that are maintained by ____ reinforcement.
Escape; avoidance; negative.
Turning down the heat because you are too hot is an example of an _____ response; turning it down before you become too hot is an example of an _____ response.
Escape; avoidance.
When Sasha was teasing the dog, it bit her. As a result, she no longer teases the dog. The consequence for Sasha's behavior of teasing the dog was the ______ of a stimulus, and the teasing behavior subsequently ____ in frequency; therefore, this is an example of ______ _____.
Presentation; decreased; positive punishment.
Whenever Sasha pulled the dog's tail, the dog left and went into another room. As a result, Sasha now pulls the dog's tail less often when it is around. The consequence for pulling the dog's tail was the _____ of a stimulus, and the behavior of pulling the dog's tail subsequently _____ in frequency; therefore, this is an example of _______ _______.
Removal; decreased; negative punishment.
When Alex burped in public during his date with Stephanie, she got angry with him. Alex now burps quite often when he is out on a date with Stephanie. The consequence for burping was the ____ of a stimulus, and the behavior of burping subsequently ______ in frequency; therefore, this is an example of ________ _______.
Presentation; increased; positive reinforcement.
When Alex held the car door open for Stephanie, she made a big fuss over what a gentleman he was becoming. Alex no longer holds the car door open for her. The consequence for holding open the door was the _____ of a stimulus, and the behavior of holding open the door subsequently _______ in frequency; therefore, this is an example of _______ ______.
Presentation; decreased; positive punishment.
When Tenzing shared his toys with his brother, his mother stopped criticizing him. Tenzing now shares his toys with his brother quite often. The consequence for sharing the toys was the _____ of a stimulus, and the behavior of sharing the toys subsequently _______ in frequency; therefore, this is an example of ______ _______.
Removal; increased; negative reinforcement.
Which of the following most closely parallels what happens in a Skinner box?
a. You are at home watching television and raiding the refrigerator.
b. You are at work with lots to do. Meals are served at fixed times during the day.
c. You are in prison with nothing to do. Meals are served at fixed times during the day.
d. You are in your apartment with nothing to do but bake cookies and eat them.
D.) You are in your apartment with nothing to do but bake cookies and eat them
The behavior of lever pressing for food is said to be
a. emitted by the rat.
b. elicited by the food.
c. emitted by the food.
d. elicited by the rat.
A.) Emitted by the rat
From an operant conditioning perspective, chocolate is a reinforcer if it
a. both strengthens the behavior that precedes it and elicits salivation.
b. strengthens the behavior that follows it.
c. strengthens the behavior that precedes it.
d. elicits salivation.
C.) Strengthens the behavior that precedes it
A dog is given a treat each time it comes when called, and as a result no longer comes when called. The ________ is an example of ________.
a. treat; negative reinforcement
b. treat; punishment
c. treat; a punisher
d. decrease in behavior; a punisher
C.) Treat; Punisher
When Hai visits his parents, he whines a lot about how unappreciated he is at work. It seems likely that the presence of his parents is ________ for whining.
a. reinforcement
b. a conditioned stimulus
c. a reinforcer
d. a discriminative stimulus
D.) A Discriminative Stimulus
"I'll do anything to avoid housework." This statement speaks to the power of
a. negative punishment.
b. positive punishment.
c. positive reinforcement.
d. negative reinforcement.
D.) Negative Reinforcement
Jim compliments his secretary on her sexy new outfit when she offers to bring him coffee one morning. She never again offers to bring him coffee. Out of the following, this is an example of which type of process?
a. negative reinforcement
b. positive reinforcement
c. positive punishment
d. negative punishment
C.) Positive Punishment
When Sean doesn't cry, he doesn't get an extra helping of dessert. As a result, he always cries at the dinner table. This is best interpreted as an example of
a. negative punishment.
b. positive reinforcement.
c. extinction.
d. positive punishment.
B.) Positive Reinforcement
Suzie notices that her daughter Nina loves to play piano. Suzie decides to encourage her further by promising to pay her a dollar for every extra hour of piano practice in the evening. Chances are that Nina's intrinsic interest in playing the piano will likely
a. decrease.
b. remain unchanged.
c. increase.
d. both increase and remain unchanged.
A.) Decrease
Which of the following is an example of shaping?
a. Reinforcing the rat for gradual approximations to lever pressing.
b. Reinforcing the behavior of lever pressing.
c. Gradual reinforcement for lever pressing.
d. Reinforcing gradual approximations to lever pressing.
D.) Reinforcing Gradual Approximations to Lever Pressing