
Chapter 6: Response Rate Schedules

Response Rate Schedules
A schedule of reinforcement is a rule about the delivery of reinforcers
Not all instances of a response produce a reinforcer
There are different schedules of reinforcement
Schedule of Reinforcement Effects
Rate of responding
Pattern of responding
Schedule of reinforcement effects (SREs) are predictable
Associated with Skinner
Simple Schedules
Ratio Schedules
rule is based on the number of responses
Ex) amount of work
Interval schedules
rule is based on time, plus the individual must emit at least 1 response
Time schedules
rule is based on an amount of time; no response requirement
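The three simple-schedule rules above can be sketched as predicates that decide when a reinforcer is delivered (an illustrative sketch; the function names and numbers are my own, not from the chapter):

```python
def ratio_rule(responses_since_sr, requirement):
    """Ratio: based only on the number of responses (e.g., FR 10)."""
    return responses_since_sr >= requirement

def interval_rule(seconds_since_sr, interval, has_responded):
    """Interval: the time must elapse AND the individual must emit 1 response."""
    return seconds_since_sr >= interval and has_responded

def time_rule(seconds_since_sr, interval):
    """Time: based only on elapsed time; no response requirement."""
    return seconds_since_sr >= interval

print(ratio_rule(10, 10))            # True: the 10th response satisfies FR 10
print(interval_rule(65, 60, False))  # False: interval elapsed, but no response yet
print(time_rule(65, 60))             # True: time alone is enough
```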
Cumulative Recorder
If the organism responds, the needle moves up
If the organism doesn't respond, the needle stays in place
The slope of the line tells you the rate of responding
A flatter slope means a slower rate of responding
A downward inflection tells you a reinforcer was delivered
Ratio Schedules
Depends on # of responses
2 types: Fixed Ratio & Variable Ratio
Fixed Ratio Schedules
FR # = the number of responses required for the reinforcer to be delivered
FR1= CRF: continuous reinforcement schedule
Pattern of responding of FR schedules
After each reinforcer there is no responding at first; then the organism begins to respond at a steady, high rate
Post Reinforcement Pause
Pause at the beginning of each trial (after each reinforcer)
The size of the post-reinforcement pause is determined by the size of the FR requirement
A larger requirement will produce a longer pause
Ratio Run
high steady rate
Ratio Strain
Increase the requirement gradually, e.g., from FR1 up to FR75
If you don't, responding might extinguish
If you make the jump too quickly, you may get ratio strain
Instead of a high, steady run, you might get breaks in responding
Variable Ratio Schedule
VR # = the average number of responses required for the reinforcer to be delivered
Unpredictable: you don't know when the reinforcer will be delivered
Pattern & Rate of responding for VR Schedules
Pattern of responding: no post-reinforcement pause
Steady pattern, constant throughout
Rate of responding: about the same as FR, higher than interval schedules
Example of VR Schedule
A custodian walks into a room knowing they are going to clean it, but they never know how much cleaning needs to be done; it's a gamble
Interval Schedules
A certain amount of time has to pass before a behavior can be reinforced
2 Types: Fixed Interval and Variable Interval
Fixed Interval Schedules
Requires a certain amount of time
FI #: after # sec, the reinforcer becomes available
# sec + 1 response
Only one response is required once the interval has elapsed
The interval of time is fixed; it never varies
Pattern & responding of FI schedules
Pattern: post-reinforcement pause, followed by a gradually increasing rate until the reinforcer is delivered
Rate: fastest right before the reinforcer is delivered
Rate is lower than on a ratio schedule, because a ratio schedule depends on the number of responses while an interval schedule depends on time
Post Reinforcement Pause (FI)
The duration of the post reinforcement pause is determined by the interval requirement
Longer intervals will have longer post reinforcement pause
Limited Hold
The reinforcer is "held": it is only available for a limited time
If the reinforcer is not picked up in time, a new trial begins
Changes the rate and pattern of responding
Limited hold changes pattern & rate
With a limited hold, behavior on a fixed interval schedule looks a lot like behavior on a fixed ratio schedule
On a fixed ratio schedule, responding remains steady and high
So you can get performance that looks much more like a fixed ratio schedule
Variable Interval Schedule
VI # = the average time interval after which the reinforcer becomes available
The reinforcer will become available on average after # min
Delivery of the reinforcer is not predictable
Pattern & Rate of responding of VI Schedules
No post-reinforcement pause
Rate: lower than a ratio schedule
Example of VI Schedule
When you call someone and they are on the phone, you get a busy signal. You don't know when they will get off the phone, so you have to redial the number, and you don't know how many times you will have to redial.
Fixed Vs Variable
Overall rates of responding are about the same, as long as the requirements are equal
Fixed: post-reinforcement pause (whether ratio or interval)
Steady increase in rate until the reinforcer is delivered
Variable: steady pattern of responding
Ratio Vs Interval
Ratio: higher rate of responding
Interval: lower rate of responding
Reynolds (1975)
Could it be that the reinforcement rate is different between the two animals?
Pigeon 1: VR, programmed by the experimenter
Pigeon 2: VI, yoked to pigeon 1 to hold the rate of reinforcement constant; its reinforcement is determined by pigeon 1
Why do ratio schedules have higher rates than interval schedules?
In a ratio schedule, the more you respond, the more reinforcers you receive
Two theories
1. Reinforcement (SR) of inter-response times
2. Feedback function
Reinforcer of Inter-response times
On a ratio schedule, inter-response times are shorter because the faster the organism works, the faster the reinforcer is delivered
The reinforcer reinforces both the response and the time between responses (the inter-response time that precedes it)
On a ratio schedule, inter-response times are very short, so short inter-response times are reinforced
On interval schedules, longer inter-response times are likely to occur, so longer inter-response times are reinforced
Feedback Function
The reinforcer serves as feedback (information about an individual's performance in that experimental session)
Rate Schedules
A specified rate of responding is the requirement
Two basic schedules
1. Differential reinforcement of high rate
2. Differential reinforcement of low rates of behavior
Differential reinforcement of high rate (DRH)
Builds high rates of behaviors
Requirement: at least X responses must occur in the interval
The rate can exceed the requirement
Ex) at least 60 responses/min must occur
Differential reinforcement of low rate (DRL)
About slowing things down: you want to put space between responses, i.e., to decrease the rate
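The two rate requirements can be sketched as checks on a list of response times, in seconds from the start of the interval (an illustrative sketch; the function names and numbers are my own):

```python
def meets_drh(response_times, min_responses, window):
    """DRH: at least min_responses must occur within the window."""
    return len([t for t in response_times if t <= window]) >= min_responses

def meets_drl(response_times, min_irt):
    """DRL: every inter-response time must be at least min_irt seconds."""
    irts = [b - a for a, b in zip(response_times, response_times[1:])]
    return all(irt >= min_irt for irt in irts)

fast = [0.5 * i for i in range(1, 9)]   # 8 responses in 4 seconds
print(meets_drh(fast, 8, 60))           # True: 8 responses easily fit in 60 s
print(meets_drl(fast, 5.0))             # False: 0.5 s gaps are far too short
print(meets_drl([0, 6, 13, 21], 5.0))   # True: every gap is at least 5 s
```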
Time Schedules
Aka noncontingent SR
Contingent only on time; no response required
E.g., on FT 1 min, every minute a reinforcer is delivered, independent of responding
Can be fixed (FT) or variable (VT)
can lead to superstitious behavior
Choice behavior
Concurrent schedule: two schedules of reinforcement that are independent and available at the same time
Concurrent chain schedule: a key is illuminated, and that key is associated with a particular schedule of reinforcement
Matching Law
B = behavior (response rate)
R = rate of reinforcement
The relative rate of responding on one alternative is equal to the relative rate of reinforcement on that alternative:

B1 / (B1 + B2) = R1 / (R1 + R2)
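A quick numeric check of the matching law, B1/(B1 + B2) = R1/(R1 + R2), on a concurrent pair of schedules (a sketch; the reinforcement rates are my own example):

```python
def predicted_relative_responding(r1, r2):
    """Relative rate of responding on alternative 1 predicted by matching:
    B1/(B1+B2) = R1/(R1+R2)."""
    return r1 / (r1 + r2)

# One alternative pays 40 reinforcers/hr, the other 20/hr:
print(predicted_relative_responding(40, 20))  # ~0.667: 2/3 of responses go to the richer alternative
```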
Generalized Matching Law
Baum (1974)
Due to sensitivity differences, deviations from strict matching can occur:
Undermatching: the organism behaves closer to indifference, as though there is less of a difference between the alternatives than there really is
Overmatching: the organism behaves as though there is more of a difference than there really is; super-sensitive, "overdoing it"
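Baum's (1974) generalized matching law is commonly written in power-function form, B1/B2 = b(R1/R2)^s, with sensitivity s and bias b. A sketch showing how s < 1 gives undermatching and s > 1 gives overmatching (the parameter values are illustrative):

```python
def behavior_ratio(r1, r2, s=1.0, b=1.0):
    """B1/B2 predicted by the generalized matching law: b * (R1/R2)**s."""
    return b * (r1 / r2) ** s

# Alternative 1 pays 4x as often as alternative 2:
print(behavior_ratio(4, 1))         # 4.0: strict matching (s = 1)
print(behavior_ratio(4, 1, s=0.5))  # 2.0: undermatching, closer to indifference
print(behavior_ratio(4, 1, s=1.5))  # 8.0: overmatching, exaggerates the difference
```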
Self Control
Wait longer for a larger reinforcer
As time goes on, the value of a reward decays as a hyperbolic function
The exact function depends on the individual; different people have different hyperbolic functions
For some individuals, a reward either is immediate or has almost no value
Immediately, that small reward has more reinforcing value; the longer you wait, the less value it has
Look at the reward function in the book
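The hyperbolic reward function referred to above is commonly written V = A / (1 + kD), where A is the reward amount, D the delay, and k the individual's discounting rate (different people have different k's). A sketch with illustrative values:

```python
def discounted_value(amount, delay, k):
    """Hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay)

k = 0.1  # illustrative discounting rate; not from the chapter
print(discounted_value(5, 0, k))    # 5.0: a small reward now keeps its full value
print(discounted_value(10, 30, k))  # ~2.5: a larger but long-delayed reward is worth less than 5 now
print(discounted_value(10, 2, k))   # ~8.3: at a short delay the larger reward wins
```

This illustrates self-control as a choice problem: whether the larger reward is preferred depends on how long you have to wait for it.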