
what do schedules of reinforcement mean?

-the schedules represent different behavioral contingencies that must be met in order to obtain reward
-created by B.F. Skinner

what are the 4 primary Schedules of reinforcement?

1. Fixed-Ratio (FR)
2. Variable-Ratio (VR)
3. Fixed-Interval (FI)
4. Variable-Interval (VI)

What is a Fixed-Ratio (FR) schedule of reinforcement?

-A constant number of responses is necessary to produce reinforcement
-ex) FR1 (also called continuous reinforcement schedule); each response results in reinforcement.
-FR1: rat in an operant chamber, each barpress will result in one food pellet
-FR10: the rat must respond 10 times; after the 10th response, he gets reinforced
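-as a rough Python sketch of the FR contingency (a hypothetical illustration, not part of the original material):

def fixed_ratio(n):
    # Hypothetical sketch: reinforce every nth response (FR-n).
    # fixed_ratio(1) would model FR1, i.e. continuous reinforcement.
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:        # the nth response produces reinforcement
            count = 0         # start counting toward the next reinforcer
            return "reinforcer"
        return None
    return respond

press = fixed_ratio(10)                  # FR10: every 10th press is reinforced
outcomes = [press() for _ in range(30)]  # three reinforcers in 30 presses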

what is a real world example of FR schedule ?

-child receiving a toy from a cereal company after sending in 10 cereal box tops. (FR10)

what are 3 characteristics of FR schedules?

1. FR schedules produce a consistent response rate
2. The rate of responding increases with higher FR schedules.
3. Post-reinforcement pause: following reinforcement, responding temporarily stops. After the pause, responding resumes at pre-reinforcement levels.

what is a Variable-Ratio (VR) schedule?

-It is like an FR schedule, except the actual number of responses required to produce reinforcement varies from one reward to the next.
-the VR schedule is described in terms of the average number of responses required to get reinforcement
-For example, a VR10 schedule may be set up so that the subject has to barpress:
9 times on one trial,
11 times on the next trial,
12 times on the trial after that,
8 times on the next, and so on.

The average number of responses required is 10.
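-as a hypothetical Python sketch (not from the card), the only change from the FR sketch is that the required count is redrawn around the average after each reinforcer:

import random

def variable_ratio(average, spread=2):
    # Hypothetical sketch: a VR schedule whose requirement varies between
    # average - spread and average + spread responses.
    required = random.randint(average - spread, average + spread)
    count = 0
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:   # requirement met: deliver reinforcement
            count = 0
            required = random.randint(average - spread, average + spread)
            return "reinforcer"
        return None
    return respond

press = variable_ratio(10)  # VR10: about every 10th response is reinforced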

what are the 3 characteristics of Variable-Ratio schedules of reinforcement?

1. Like FR schedule, VR schedule produces a consistent response rate.
2. The bigger the ratio, the higher the response rate.
3. A post-reinforcement pause is usually not observed. This results in higher overall response rates than are typically observed on FR schedules.

what is a real world example of Variable Ratio schedule of reinforcement?

-Many forms of gambling are examples of VR schedules. Games of chance such as slot machines, roulette wheels and lotteries all exhibit two important characteristics of VR schedules.
1. The person's chances of winning are directly proportional to the number of times the person plays.
2. The number of responses required for the next reinforcer is uncertain.

what are Fixed Interval Schedules of reinforcement?

-Reinforcement is available only after a specified period of time: the first response performed after the interval has elapsed is reinforced.
-ex) FI 1-min: the first response after 1 minute has elapsed is reinforced. FI 2-min: the first response after 2 minutes is reinforced.
-The most efficient behavior would be to wait until after the time period has elapsed and then respond.
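-a hypothetical Python sketch of the FI rule (an illustration, not from the card): responses before the interval elapses have no effect; the first response afterward is reinforced and starts the next interval.

import time

def fixed_interval(seconds):
    # Hypothetical sketch of an FI schedule based on a timer.
    available_at = time.monotonic() + seconds
    def respond():
        nonlocal available_at
        if time.monotonic() >= available_at:            # interval has elapsed
            available_at = time.monotonic() + seconds   # start the next interval
            return "reinforcer"
        return None                                     # too early: no effect
    return respond

press = fixed_interval(60)  # FI 1-min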

what are the 3 characteristics of Fixed Interval schedules?

Typically, the behavior you see is the following:
1. The subject will pause after reward delivery
2. Then they respond at a slow rate
3. Then right around the time of delivery the rate increases

-this is called the Scallop Effect (the subject knows it has to wait a certain amount of time, so after receiving the reward it slows down for a while, then picks up speed as the time of the next reinforcer approaches)

what is a real world example of a Fixed Interval schedule?

waiting for a bus: as it gets closer to the time the bus is due, you look down the road more and more often

what is a variable interval schedule of reinforcement?

-the VI schedule is similar to the FI schedule except for one thing: the interval of time between periods when reinforcement is available varies
-When we refer to VI schedules we note the average time intervals
-Example: VI 2-min: on average, the rat must wait 2 minutes before a response will produce reinforcement.
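-as a hypothetical Python sketch (not from the card), this is the FI sketch with each new wait drawn around the average interval:

import random
import time

def variable_interval(average_seconds, spread=30):
    # Hypothetical sketch: the wait before reinforcement is next available
    # varies uniformly around the average.
    def next_wait():
        return random.uniform(average_seconds - spread, average_seconds + spread)
    available_at = time.monotonic() + next_wait()
    def respond():
        nonlocal available_at
        if time.monotonic() >= available_at:              # reinforcement available
            available_at = time.monotonic() + next_wait() # draw the next interval
            return "reinforcer"
        return None
    return respond

press = variable_interval(120)  # VI 2-min: waits average about 2 minutes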

what are the 2 characteristics of Variable Interval schedule?

1. Steady rate of responding
2. Unlike the FI schedule, there is NO scallop effect just prior to reinforcement, because the subject cannot predict when reinforcement will next become available.

what is a real world example of Variable Interval schedule of reinforcement?

-checking the mail
-The delivery of mail approximates a VI schedule because it is unpredictable
-Only one response is required to get the mail
-You must check it after an unpredictable amount of time

what are the 2 classes of Schedules?

1) ratio schedules (based on the number of responses emitted by the subject)
2) interval schedules (based on time periods between reinforcers)

what are 2 other schedules of Reinforcement?

1) Differential Reinforcement Schedules
2) Compound/Complex schedule (multiple schedule)

what are the 3 types of Differential Reinforcement?

1) Differential reinforcement of low responding schedules (DRL)
2) Differential reinforcement of high responding schedules (DRH)
3) Differential reinforcement of other behaviors schedule (DRO)

what is differential reinforcement of low (DRL) responding schedules?

-Reinforcement is contingent upon a low rate of responding. A certain interval of time must elapse without a response, then the first response at the end of the interval is reinforced.
-If a response occurs before completion of the time interval, it resets the clock so the time period starts over.
-ex) DRL 20-sec: the rat must wait 20 seconds before it can respond. If it presses after, say, 15 seconds, the clock resets and it has to wait another 20 seconds before a press will produce reward.
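-the clock-reset rule can be sketched in Python as follows (a hypothetical illustration, not from the card):

import time

def drl(seconds):
    # Hypothetical sketch of a DRL schedule: a response is reinforced only if
    # at least `seconds` have elapsed since the previous response; responding
    # too soon resets the clock.
    last_response = time.monotonic()
    def respond():
        nonlocal last_response
        now = time.monotonic()
        waited_long_enough = (now - last_response) >= seconds
        last_response = now                # every response resets the clock
        return "reinforcer" if waited_long_enough else None
    return respond

press = drl(20)  # DRL 20-sec: pressing before 20 s have passed restarts the wait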

what is a real world example of DRL?

-you go to start your car and you flood the engine. If you try to start it again before waiting several minutes, it will flood again and you have to wait even longer.

what is differential reinforcement of high responding (DRH)?

-Reinforcement is made contingent upon a high rate of responding. The subject must complete a certain number of responses in a specified time period. For example, a reinforcer may occur each time the subject makes 10 responses in 5 seconds.
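-a hypothetical Python sketch of this "N responses within T seconds" rule (not from the card):

import time
from collections import deque

def drh(count, seconds):
    # Hypothetical sketch of a DRH schedule: reinforce whenever the subject
    # has made `count` responses within the last `seconds`.
    recent = deque()
    def respond():
        now = time.monotonic()
        recent.append(now)
        while recent and now - recent[0] > seconds:  # drop responses outside window
            recent.popleft()
        if len(recent) >= count:                     # rate is high enough: reinforce
            recent.clear()
            return "reinforcer"
        return None
    return respond

press = drh(10, 5)  # a reinforcer each time 10 responses occur within 5 seconds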

what is a real world example of DRH?

-a teacher gives you 2 days to write a 5-page paper; you must work at a high rate to complete the paper within that time frame.

what is differential reinforcement of Other Behaviors?

-Reinforcement is given only if there is an absence of a particular response in a specified period of time.
-ex) One child constantly hits his younger brother. Parent may tell him that he will get to watch T.V. after dinner (reinforcer) only if he refrains from hitting his brother again (in the time from now until dinner).
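-a hypothetical Python sketch of the DRO rule (not from the card): the reinforcer is delivered only if the interval passes without the target behavior, and the behavior restarts the interval.

import time

def dro(seconds):
    # Hypothetical sketch of a DRO schedule.
    deadline = time.monotonic() + seconds
    def target_behavior():
        # e.g., hitting: the unwanted response postpones reinforcement.
        nonlocal deadline
        deadline = time.monotonic() + seconds
    def check_reinforcement():
        # Called when the interval is up (e.g., at dinner time).
        nonlocal deadline
        if time.monotonic() >= deadline:   # interval passed with no target behavior
            deadline = time.monotonic() + seconds
            return "reinforcer"
        return None
    return target_behavior, check_reinforcement

hit, check = dro(3600)  # reinforcer only if a full hour passes without hitting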

what are the 2 types of Compound Complex schedules?

-These are schedules that combine 2 or more simple schedules in some way
1) concurrent schedule
2) chained schedules

what is concurrent schedule?

-it is a type of complex/compound schedule
-The subject is presented with two or more response alternatives (eg. 2 levers at the same time), each associated with its own reinforcement schedule.
-ex)Lever 1: FR1 for food
Lever 2: VI 2-min for water
-The experimenter can determine which schedule/reinforcer the rat prefers and how hard he is willing to work for it, relative to another reinforcer.

what is a chain schedule?

-The subject must complete the requirement for two or more simple schedules in a fixed sequence, and each schedule is signaled by a different stimulus.

Example: Chain FR5, FR10

The rat must first complete the FR5 on one lever, which has a light illuminated over it, then press a second lever, which has another light over it, on an FR10 schedule; THEN he will get reinforced.
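A hypothetical Python sketch of this chain FR5 FR10 example (not from the card):

def chain_fr(first=5, second=10):
    # Hypothetical sketch of a chained schedule: the FR5 link on lever 1
    # (signaled by light 1) must be completed before the FR10 link on lever 2
    # (signaled by light 2); only the final link ends in reinforcement.
    stage, count = 1, 0
    def press(lever):
        nonlocal stage, count
        if lever != stage:          # presses on the inactive lever do nothing
            return None
        count += 1
        if stage == 1 and count == first:
            stage, count = 2, 0     # light 1 off, light 2 on
            return "lever 2 active"
        if stage == 2 and count == second:
            stage, count = 1, 0     # food delivered; the chain starts over
            return "reinforcer"
        return None
    return press

press = chain_fr()  # press(1) five times, then press(2) ten times -> reinforcer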

how does extinction of an instrumental response occur?

-Extinction occurs if, after the animal has learned the instrumental task, the reward no longer follows the response.
-ex)Rat learned to barpress on an FR1 schedule for food

After learning: each response is no longer followed with food

Result: Extinction

what is the 3-step pattern that extinction of instrumental responses follows?

1. Initially, the animal will barpress at a relatively high rate
2. Then responses will decrease and often become erratic.
e.g., the rat may stop barpressing for a while, then go back to the lever and try again.
3. Eventually responses will cease altogether; at this point we can say the instrumental response has been extinguished.

what is a real world example of extinction?

ex) if a child wants a candy bar, and mom says no, the child throws a tantrum.
-to extinguish this behavior, the parent does not give in and lets the child cry it out, so the tantrum no longer produces the candy bar
