Born in Pennsylvania! Burrhus Frederic Skinner was born March 20, 1904, in the small Pennsylvania town of Susquehanna. His father was a lawyer, and his mother a strong and intelligent housewife. His upbringing was old-fashioned and hard-working. Burrhus was an active, outgoing boy who loved the outdoors and building things, and actually enjoyed school. His life was not without its tragedies, however; in particular, his brother died at the age of 16 of a cerebral aneurysm.
Burrhus received his BA in English from Hamilton College in upstate New York. He didn't fit in very well, not enjoying the fraternity parties or the football games. He wrote for the school paper, including articles critical of the school, the faculty, and even Phi Beta Kappa! To top it off, he was an atheist -- in a school that required daily chapel attendance. He wanted to be a writer and did try, sending off poetry and short stories. When he graduated, he built a study in his parents' attic to concentrate, but it just wasn't working for him. Ultimately, he resigned himself to writing newspaper articles on labor problems, and lived for a while in Greenwich Village in New York City as a "bohemian."
After some traveling, he decided to go back to school, this time at Harvard. He got his master's in psychology in 1930 and his doctorate in 1931, and stayed there to do research until 1936. That same year, he moved to Minneapolis to teach at the University of Minnesota. There he met and soon married Yvonne Blue. They had two daughters, the second of whom became famous as the first infant to be raised in one of Skinner's inventions, the air crib. Although it was nothing more than a combination crib and playpen with glass sides and air conditioning, it looked too much like keeping a baby in an aquarium to catch on. In 1945, he became the chairman of the psychology department at Indiana University. In 1948, he was invited to come to Harvard, where he remained for the rest of his life.
He was a very active man, doing research and guiding hundreds of doctoral candidates as well as writing many books. While not successful as a writer of fiction and poetry, he became one of our best psychology writers; his books include Walden II, a fictional account of a community run by his behaviorist principles. On August 18, 1990, B. F. Skinner died of leukemia, after becoming perhaps the most celebrated psychologist since Sigmund Freud.
Theory (B. F. Skinner)
His entire system is based on operant conditioning.
The organism is in the process of "operating" on the environment, which in ordinary terms means it is bouncing around its world, doing what it does. During this "operating," the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant -- that is, the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future."
Reinforcing stimulus (Skinner)
a reinforcer that increases the operant
the behavior occurring just before the reinforcer
a behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future. What if you don't give the rat any more pellets? Apparently, he's no fool, and after a few futile attempts, he stops his bar-pressing behavior. This is called extinction of the operant behavior.
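The acquisition-then-extinction pattern can be sketched with a toy model (my own illustration, not Skinner's mathematics): each reinforced trial pulls the response strength toward 1, and each unreinforced trial pulls it back toward 0.

```python
def response_strength(history, strength=0.1, lr=0.2):
    """Toy operant model. `history` is a sequence of trial outcomes:
    1 = the response was reinforced, 0 = it was not. The learning rate
    `lr` and the starting strength are arbitrary illustrative values."""
    for reinforced in history:
        strength += lr * (reinforced - strength)
    return strength

# Acquisition: 20 reinforced bar presses drive strength up near 1.
acquired = response_strength([1] * 20)

# Extinction: 20 unreinforced presses drive it back down toward 0.
extinguished = response_strength([0] * 20, strength=acquired)
```

The gradual decay during the unreinforced trials mirrors the rat's few futile attempts before it quits; spontaneous recovery is not captured by so simple a model.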
What happens when you reintroduce the reinforcing stimulus after extinction?
Now, if you were to turn the pellet machine back on, so that pressing the bar again provides the rat with pellets, the behavior of bar-pushing will "pop" right back into existence, much more quickly than it took for the rat to learn the behavior the first time. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the rat was reinforced for pushing on the bar!
Spontaneous recovery (Skinner)
The subject must first be in extinction. Spontaneous recovery is an increase in responding at the beginning of a new session when no reinforcement is provided.
Schedules of reinforcement
Skinner likes to tell about how he "accidentally" -- i.e. operantly -- came across his various discoveries. For example, he talks about running low on food pellets in the middle of a study. Now, these were the days before "Purina rat chow" and the like, so Skinner had to make his own rat pellets, a slow and tedious task. So he decided to reduce the number of reinforcements he gave his rats for whatever behavior he was trying to condition, and, lo and behold, the rats kept up their operant behaviors, and at a stable rate, no less. This is how Skinner discovered schedules of reinforcement!
Schedule of reinforcement - Continuous Reinforcement (Skinner)
Continuous reinforcement is the original scenario: Every time that the rat does the behavior (such as pedal-pushing), he gets a rat goodie.
Schedule of reinforcement - Fixed ratio schedule (Skinner)
The fixed ratio schedule was the first one Skinner discovered: If the rat presses the pedal three times, say, he gets a goodie. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc. This is a little like "piece rate" in the clothing manufacturing industry: You get paid so much for so many shirts. If you do homework for 30 minutes, you get to take a 5-minute dance party break.
Ratio- The number of responses made before reinforcement is given.
Schedule of Reinforcement - Fixed Interval Schedule (Skinner)
uses a timing device of some sort. If the rat presses the bar at least once during a particular stretch of time (say 20 seconds), then he gets a goodie. If he fails to do so, he doesn't get a goodie. But even if he hits that bar a hundred times during that 20 seconds, he still only gets one goodie! One strange thing that happens is that the rats tend to "pace" themselves: They slow down the rate of their behavior right after the reinforcer, and speed up when the time for it gets close.
Interval- The amount of time that has to elapse before a response will give you a reinforcement.
Schedule of Reinforcement - Variable Schedules (Skinner)
Variable ratio means you change the "x" each time -- first it takes 3 presses to get a goodie, then 10, then 1, then 7 and so on. Variable interval means you keep changing the time period -- first 20 seconds, then 5, then 35, then 10 and so on. In both cases, it keeps the rats on their rat toes. With the variable interval schedule, they no longer "pace" themselves, because they can no longer establish a "rhythm" between behavior and reward. Most importantly, these schedules are very resistant to extinction. It makes sense, if you think about it. If you haven't gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad" ratio or interval! Just one more bar press, maybe this'll be the one!
Mechanism of gambling (Skinner)
Part of variable schedules. You may not win very often, but you never know whether and when you'll win again. It could be the very next time, and if you don't roll them dice, or play that hand, or bet on that number this once, you'll miss on the score of the century!
Continuous reinforcement (Skinner)
- every response gives the subject a reinforcement. Abbreviated CRF (Continuous Reinforcement).
Fixed ratio schedule- FR___ (Skinner)
if the 5 is on the line, the subject has to make that number of responses before it gets a reinforcement. It takes the least amount of time to initiate extinction with this schedule- it is the easiest to extinguish.
Variable ratio schedule- VR___ (Skinner)
if 5 ("variable" means average), then the animal has to make an average of five responses- it could make 2 one time, then 7 the next, etc.
Fixed interval schedule- FI___(Skinner)
if 5, then the animal has to wait 5 minutes, make a response, and then get a reinforcement.
Variable interval schedule- VI___ (Skinner)
if 5, the animal has to wait an average of five minutes to get a reinforcement. This is the hardest schedule on which to initiate extinction, because these subjects have developed a lot of patience.
Scalloping design (Skinner)
- seen with fixed interval schedules- the subject responds rapidly as reinforcement comes due, then does nothing for a while, then responds rapidly again, producing a scallop-shaped pattern of responding.
How many reinforcements do rats get per day?
Most rats get about 75 reinforcements per day. Each rat pellet weighs about 0.10 grams, so that comes to about 7.5 grams- enough to maintain a rat for a day.
Shaping- method of successive approximations (Skinner)
You put some food pellets inside the food tray; eventually the rat finds and eats the food, then goes back to the corner where it feels comfortable and safe. You then require the rat to sit closer and closer to the food tray in order to get the food. Eventually, you want the rat sitting right next to the bar, so that it comes to hit the bar on its own. You condition the rat to do a little bit more each time, until it goes from sitting in the corner, not engaging in the experiment, to hitting the bar and knowing that food will come out.
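The procedure above can be sketched as a loop that tightens the criterion for reinforcement each time the animal meets it. This is a toy model with made-up numbers (the distances, drift, and tightening factor are all illustrative), not a record of Skinner's actual procedure.

```python
import random

def shape(trials=500, start=100.0, tighten=0.9):
    """Toy shaping loop. `position` is the rat's distance (cm) from the
    bar; any moment it comes within the current criterion counts as a
    successive approximation and is reinforced, after which the
    criterion is tightened. All numbers are illustrative."""
    criterion = start   # at first, any approach at all is good enough
    position = start    # the rat starts in its far corner
    for _ in range(trials):
        # The rat wanders; the downward drift stands in for the pull
        # of the food tray.
        position = max(0.0, position + random.uniform(-10, 5))
        if position <= criterion:
            criterion *= tighten   # next time it must get a bit closer
    return criterion

print(shape())  # the criterion ends up a tiny fraction of where it began
```

The point of the sketch is the ratchet: each reinforced approximation makes the next requirement slightly stricter, so behavior that starts as "anywhere near the tray" ends as "pressing the bar."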
Joseph Wolpe - systematic desensitization (Skinner)
if you have a phobia, you can use systematic desensitization to get over it. E.g. for a fear of heights: you would stand outside a large building; when you feel comfortable, move up a floor, and so on, until you are not afraid to be on the highest story of a skyscraper.
Aversive stimulus (Skinner)
is the opposite of a reinforcing stimulus, something we might find unpleasant or painful. E.g. lithium chloride for sheep. A behavior followed by an aversive stimulus results in a decreased probability of the behavior occurring in the future.
- Skinner first thought that you could learn as much from punishment as from positive reinforcement, but later changed his mind, concluding that punishment does not work. This is because in punishment the control is put in the hands of the experimenter, not the hands of the subject. What is more beneficial is negative reinforcement. E.g. "Go to your room" as a punishment for your children.
Negative reinforcement (Skinner)
- behavior followed by the removal of an aversive stimulus results in an increased probability of that behavior occurring in the future.
E.g. "Go to your room until you understand what you did wrong, then come and explain it to me." The control is back in the hands of the subject.
Behavior modification (Skinner)
often referred to as b-mod -- is the therapy technique based on Skinner's work. It is very straightforward: Extinguish an undesirable behavior (by removing the reinforcer) and replace it with a desirable behavior by reinforcement. It has been used on all sorts of psychological problems -- addictions, neuroses, shyness, autism, even schizophrenia -- and works particularly well with children. There are examples of back-ward psychotics who haven't communicated with others for years who have been conditioned to behave in fairly normal ways, such as eating with a knife and fork, taking care of their own hygiene needs, dressing themselves, and so on.
There is an offshoot of b-mod called the token economy. This is used primarily in institutions such as psychiatric hospitals, juvenile halls, and prisons. Certain rules are made explicit in the institution, and behaving yourself appropriately is rewarded with tokens -- poker chips, tickets, funny money, recorded notes, etc. Certain poor behavior is also often followed by a withdrawal of these tokens. The tokens can be traded in for desirable things such as candy, cigarettes, games, movies, time out of the institution, and so on. This has been found to be very effective in maintaining order in these often difficult institutions.
There is a drawback to the token economy: When an "inmate" of one of these institutions leaves, they return to an environment that reinforces the kinds of behaviors that got them into the institution in the first place. The psychotic's family may be thoroughly dysfunctional. The juvenile offender may go right back to "the 'hood." No one is giving them tokens for eating politely. The only reinforcements may be attention for "acting out," or some gang glory for robbing a Seven-Eleven. In other words, the environment doesn't travel well!
Walden II (Skinner)
Skinner started his career as an English major, writing poems and short stories. He has, of course, written a large number of papers and books on behaviorism. But he will probably be most remembered by the general run of readers for his book Walden II, wherein he describes a utopia-like commune run on his operant principles.
Beyond Freedom and Dignity (Skinner)
People, especially the religious right, came down hard on Walden II. They said that his ideas take away our freedom and dignity as human beings. He responded to the sea of criticism with another book (one of his best) called Beyond Freedom and Dignity.
Skinner: What do we mean when we say we want to be free?
Usually we mean we don't want to be in a society that punishes us for doing what we want to do. Okay -- aversive stimuli don't work well anyway, so out with them! Instead, we'll only use reinforcers to "control" society. And if we pick the right reinforcers, we will feel free, because we will be doing what we feel we want!
Skinner: When we say "she died with dignity," what do we mean?
Likewise for dignity. We mean she kept up her "good" behaviors without any apparent ulterior motives. In fact, she kept her dignity because her reinforcement history has led her to see behaving in that "dignified" manner as more reinforcing than making a scene.
Skinner: Can we truly design culture?
The bad do bad because the bad is rewarded. The good do good because the good is rewarded. There is no true freedom or dignity. Right now, our reinforcers for good and bad behavior are chaotic and out of our control -- it's a matter of having good or bad luck with your "choice" of parents, teachers, peers, and other influences. Let's instead take control, as a society, and design our culture in such a way that good gets rewarded and bad gets extinguished! With the right behavioral technology, we can design culture.
Mentalistic constructs (Skinner)
Both freedom and dignity are examples of what Skinner calls mentalistic constructs -- unobservable and so useless for a scientific psychology. Other examples include defense mechanisms, the unconscious, archetypes, fictional finalisms, coping strategies, self-actualization, consciousness, even things like hunger and thirst. The most important example is what he refers to as the homunculus -- Latin for "the little man" -- that supposedly resides inside us and is used to explain our behavior, ideas like soul, mind, ego, will, self, and, of course, personality. Instead, Skinner recommends that psychologists concentrate on observables, that is, the environment and our behavior in it.