According to ____, rather than saying that the child's behavior was reinforced by a treat, we should instead say the child's behavior was reinforced by eating a treat.
adjusting schedule
A schedule in which the response requirement changes as a function of the organism's performance while responding for the previous reinforcer.
Among the four basic intermittent schedules, fixed schedules are more likely to produce _____ than variable schedules, and ratio schedules are more likely to produce _____ than interval schedules.
post-reinforcement pauses; higher rates of response
Anson worked very hard at his courses at the start of the term. Unfortunately, he received only mediocre marks. As a result, by the end of the term, he was hardly studying. This seems to be an example of
As I look over the menu in a restaurant, I select a cheeseburger rather than spaghetti and meatballs. Both meals are approximately the same portion size and each provides the same caloric value. Which of the following theories is most useful in explaining why I choose one food over another?
behavioral bliss point approach
The theory that an organism with free access to alternative activities will distribute its behavior in such a way as to maximize overall reinforcement.
chained schedule
A schedule consisting of a sequence of two or more simple schedules, each with its own SD and the last of which results in a terminal reinforcer.
complex schedule
A type of complex schedule in which the requirements of two or more simple schedules must be met before a reinforcer is delivered.
David finds that the best way to conserve energy during a long bike trip is to pedal at a moderate but very steady pace. This is an example of a ____ schedule of reinforcement.
differential reinforcement of high rates (DRH)
A schedule in which reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time—or, more generally, reinforcement is provided for responding at a fast rate.
differential reinforcement of low rates (DRL)
A schedule in which a minimum amount of time must pass between each response before the reinforcer will be delivered—or, more generally, reinforcement is provided for responding at a slow rate.
differential reinforcement of paced responding (DRP)
A schedule in which reinforcement is contingent upon emitting a series of responses at a set rate—or, more generally, reinforcement is provided for responding neither too fast nor too slow.
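The three differential-reinforcement schedules above all place a rate criterion on responding. A minimal sketch of how each criterion could be checked against a list of inter-response times (the function names and thresholds are illustrative, not from the source):

```python
def meets_drh(irts, max_irt):
    """DRH: reinforce fast responding -- every inter-response time (IRT)
    must be no longer than max_irt seconds."""
    return all(t <= max_irt for t in irts)

def meets_drl(irts, min_irt):
    """DRL: reinforce slow responding -- every IRT must be at least
    min_irt seconds."""
    return all(t >= min_irt for t in irts)

def meets_drp(irts, min_irt, max_irt):
    """DRP: reinforce paced responding -- every IRT must fall inside
    the [min_irt, max_irt] band (neither too fast nor too slow)."""
    return all(min_irt <= t <= max_irt for t in irts)
```

For example, `meets_drp([1.1, 0.9, 1.0], 0.8, 1.2)` is True because every gap falls in the band, while the same series fails `meets_drl` with a 1.5-second minimum.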
drive reduction theory
According to this theory, an event is reinforcing to the extent that it is associated with a reduction in some type of physiological drive.
Each time the dog scratches at the door, its owner lets it outside. In technical terms, this is an example of a(n) ____ schedule of reinforcement.
fixed duration (FD) schedule
A schedule in which reinforcement is contingent upon continuous performance of a behavior for a fixed, predictable period of time.
fixed interval (FI) schedule
A schedule in which reinforcement is contingent upon the first response after a fixed, predictable period of time.
fixed ratio (FR) schedule
A schedule in which reinforcement is contingent upon a fixed, predictable number of responses.
fixed time (FT) schedule
A schedule in which the reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior.
goal gradient effect
An increase in the strength and/or efficiency of responding as one draws near to the goal.
If an animal is unable to reach its behavioral bliss point, it will
distribute its behavior to draw as close to it as possible
If Battu practices piano for one hour without a break, his mother allows him to watch television. This is an example of an ____ schedule.
In a certain experiment, reinforcement is contingent upon each response being separated by at least a certain period of time. This sounds like a ____ schedule.
In a chained schedule, the consequences for the early behaviors in the chain function as ____ that also serve as ____ for the next behavior.
secondary reinforcers; discriminative stimuli
If you check your phone to see whether you have text messages, you sometimes have one. You can't make text messages arrive faster by checking more frequently. Which of the following schedules of reinforcement fits this example?
incentive motivation
Motivation derived from some property of the reinforcer, as opposed to an internal drive state.
intermittent (or partial) reinforcement schedule
A schedule in which only some responses are reinforced.
Jumper, the Betta splendens that Russ has in his office, first learned to approach the toothpick in order to get food. Once this was accomplished, it then had to tap it with its snout to get food. This is an example of a(n) ____ schedule of reinforcement.
The most "exciting" romantic relationship would likely be one that is being maintained on a ____ schedule.
noncontingent schedule of reinforcement
A schedule in which the reinforcer is delivered independently of any response.
On a(n) ______ type of schedule, the attainment of one reinforcer means that the next reinforcer is necessarily some distance away.
A disruption in responding due to an overly demanding schedule is technically known as
The pigeon must turn circles continuously for an average of 10 seconds in order to earn access to food. This is an example of a ____ schedule.
Premack principle
The notion that a high-probability behavior can be used to reinforce a low-probability behavior.
response deprivation hypothesis
The notion that a behavior can serve as a reinforcer when: (1) access to the behavior is restricted and (2) its frequency thereby falls below its preferred level of occurrence.
response-rate schedule
A schedule in which reinforcement is directly contingent upon the organism's rate of response.
Superstitious behavior often develops as a function of exposure to a ____ schedule of reinforcement.
This schedule produces a moderate, steady rate of response with little or no post-reinforcement pause.
variable duration (VD) schedule
A schedule in which reinforcement is contingent upon continuous performance of a behavior for a varying, unpredictable period of time.
variable interval (VI) schedule
A schedule in which reinforcement is contingent upon the first response after a varying, unpredictable period of time.
variable ratio (VR) schedule
A schedule in which reinforcement is contingent upon a varying, unpredictable number of responses.
variable time (VT) schedule
A schedule in which the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior.
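The ratio-versus-interval contrast running through the cards above (e.g., why checking your phone more often does not produce more texts) can be made concrete with a short simulation. This is a minimal sketch with illustrative function names and parameters, not anything from the source: on a ratio schedule, reinforcers earned scale directly with responses emitted, while on an interval schedule, responding faster barely raises the reinforcement rate.

```python
import random

def fixed_ratio(n, responses):
    """FR-n: every n-th response produces a reinforcer, so reinforcers
    earned scale directly with the number of responses emitted."""
    return responses // n

def variable_interval(mean_s, resp_per_s, duration_s, seed=0):
    """VI: after each reinforcer, an unpredictable interval (exponential,
    mean mean_s seconds) must elapse; the first response after it elapses
    collects the reinforcer.  Responding is modeled as a steady rate."""
    rng = random.Random(seed)
    reinforcers = 0
    t = 0.0
    armed_at = rng.expovariate(1.0 / mean_s)  # when the next reinforcer is set up
    gap = 1.0 / resp_per_s                    # time between successive responses
    while t < duration_s:
        t += gap                  # the next response occurs
        if t >= armed_at:         # interval has elapsed: this response is reinforced
            reinforcers += 1
            armed_at = t + rng.expovariate(1.0 / mean_s)
    return reinforcers
```

Doubling the response count on `fixed_ratio(5, ...)` doubles the reinforcers earned, whereas doubling `resp_per_s` on a VI 60-second schedule leaves the count near the ceiling of about one reinforcer per minute.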
When Justin gets up in the morning, he showers, shaves, gets dressed, eats breakfast, and then drives to work. In this chain of behaviors, the terminal reinforcer is
arriving at work
Whenever possible, Boris likes to work four hours in the morning and then play golf four hours in the afternoon. This distribution of behavior represents his ____ point for these two behaviors.
Which of the following schedules would produce the longest pause after each reinforcement? FR40, VR30, VR60, FR10
Which schedule of reinforcement produces a steady-state of behavior that appears to be "scalloped"?
Yesterday morning at the bus stop, Joseph just happened to be leaning against the bus stop sign when the bus finally arrived. This morning, Joseph spent a lot of time leaning against the sign. This seems to be an example of a(n) ____ behavior that has been reinforced on a(n) ____ schedule.