
762 terms

Combo with Unit 8 and 3 others

Learning Psychology 3011
A fractional anticipatory goal response is most firmly connected to the ____?
Goal stimulus
The fact that organisms explore and manipulate____?
Provides little difficulty for drive theorists
____ is the product of learning and motivation
Excitatory reaction potential
Hull proposed that the world affecting us be represented as a sequence of ____?
Stimuli
Watson would have objected most strongly to Hull's use of ____?
Intervening variables
The phenomenon of specific hunger seemed to Richter to be strong evidence for the fundamental importance of _____?
Biological drive
The reason we do not show full blown goal responses when we begin a familiar sequence of behavior is that extinction____?
eliminates them
The dissipation of inhibition over time was used by Hull to explain____
reminiscence
Hull believed that we, like other organisms, consist of muscle, blood, bone, nerves, and visceral organs. He believed that, being human, we also come with _____?
nothing else
Henry Murray's psychogenic and viscerogenic needs _____?
Correspond to Hull's secondary and primary drives, respectively
Hull believed that habits are formed____?
through the action of the law of effect
For Hull, the most convincing kind of explanation is one that can be _____?
expressed in the workings of a machine
Conceptually there is little difference between unhappiness and _____?
the electron
For Hull, reinforcement always involves___?
reduction of a need
Hull's opinion on innate reflexive behaviors was that____?
they are adaptive
According to Hull, addressing envelopes or doing push-ups causes a gradual increase in _____?
Reactive inhibition
As we reread a passage of poetry in order to memorize it, we add to what is already learned ____?
In progressively smaller increments
When several response tendencies compete, the result is usually___?
dominance by the one with the higher momentary excitatory reaction potential
Hull's emphasis on the responses of muscles and glands and his interest in models in the form of machines shows that he was a____?
Molar theorist
In Hull's model of foresight, the world sequence is necessary at first, ____?
but then is superfluous
Hull's term that referred to a negative habit___?
Conditioned inhibition
As training continues, the quantity (M - sHr) __?
gradually decreases
We turn away from walking into a bear's den. This kind of foresight of probable consequences is____?
probably the effect of chaining
According to pragmatic philosophy, what we do to really know a large room or a hammer illustrates____?
that knowledge is largely a part of our reactions
Thirst is an example of ____?
a typical intervening variable
Hull's Monday night group used the term reciprocal habit complex to refer to___?
Behavior of a psychologist and patient
Interestingly, the body does not initiate corrective action in response to ____?
Oxygen deprivation
In Hull's initial system____?
The amount of need reduction occurring on a given trial directly influences the size of the increase in habit strength that takes place on that trial
Hull's approach to learning theory and research is called____?
Hypothetico-deductive method
Hull's evidence from testing his corollaries and theorems showed that___?
most were found to be valid
Visual and other afterimages are___?
included in Hull's early postulates
Anticipation was explained by Hull as___?
the occurrence of a fractional goal response
In Hull's system, one begins with clearly stated fundamental principles called___?
Postulates
A bodily condition that, left unattended, is life-threatening is called_____?
A primary need
For Hull, knowledge may depend in large part on___?
muscle spindles and Golgi tendon organs
Hull's concept of drive stimulus renders his view of motivation most similar to that of ____?
Guthrie
If we accelerate as we approach a goal, Hull would suggest that it is due to ____?
Sensory feedback from fractional goal responses
Sometimes a football team has the skill to win, but is not up for the game. This illustrates the principles in the experiments of _____?
Perin and Williams
unconscious inference
automatic perceptual processes dependent on cues and learning; Helmholtz argued that we form expectancies about the commonplace world without any idea that this is occurring; we see objects in the distance and it seems that we immediately "see" the distance; actually, we are relying on cues coming from eye muscles and from our past experience of distant objects to guide us; however, such sensations become so habitual that we do not notice them
structuralism
a school of psychology about the organization of the mind; Titchener; believed that the first priority was the analysis of the structure of consciousness through the method of introspection
mental chemistry
the notion that "ideas" combine to create new mental compounds; name given by John Stuart Mill to his theory of association; Mill believed that ideas combine as do chemicals, so that the whole is not simply the sum of the parts
functionalism
a school of psychology; stresses adaptive activity; one of the schools of psychology prevalent at the turn of the century; the functionalists stressed adaptive activity rather than the analysis of consciousness into static elements, as practiced by structuralists
similarity
the basic principle of associationism involving shared elements; used by empiricists; is often defined in terms of shared elements; thus, an apple is more similar to an orange than to an automobile; in most cases, we are unable to explain perceived similarity
epistemology
the study of the nature and origin of knowledge; the history of psychology is linked with the history of epistemology
statics
mind seen as set of "mental objects," i.e., images, thoughts, etc.; as used here, the view that psychological phenomena are best cast in terms of things, such as images, ideas, and motives, rather than in terms of activities
hypothetical construct
a "made up" thing, place, or process to explain a real phenomenon
parsimony
the fewer the assumptions, the better the theory
rationalism
the belief in innate knowledge or powers of reasoning; the belief that we are born with innate knowledge or powers of reason that allow us to recognize some truths independently of experience
Occam's Razor
explanatory entities should be kept to a minimum
dynamics
activities and processes; the opposite of statics
successive associations
Mill's term for a serial sequence of associated elements; sensations and ideas that are connected because they occurred previously at the same time; such things are often objects, since an object provides a set of sensations simultaneously
associationism
mind as network of connected ideas, sensations, etc.; a view that stresses the analysis of experience into elements (sensations, ideas, and so on) and proposes laws of association to account for the ways in which these elements are joined; in extreme form (e.g., Hume), this view holds that all of our experience and knowledge consists only of associated elements
anthropomorphism
attributing human characteristics to things and animals
synchronous associations
Mill's term for an amalgam of simultaneously associated elements
empiricism
the doctrine that all knowledge comes from experience; the doctrine that holds that all knowledge comes from the senses - that we are born with senses but with no innate ideas
sensation
the experience or quality produced by a sensory nerve; some psychologists, including Titchener, consider the basic irreducible or unanalyzable qualities of sensation to be the basic elements of experience
Morgan's Canon
psychological explanations should be as simple as possible
stimulus error
a term coined by Titchener to refer to faulty introspection; if a subject describes experience in terms of what he or she believes to be the stimulus for the experience, that person has committed the stimulus error; for example, we may describe a table seen from an angle as rectangular, although the sensation we experience shows it to be trapezoidal
act psychology
Brentano: mind conceived as a "continuous flow of activity," not as a "thing"; name given to the view of Franz Brentano, who opposed Wundt's analysis of consciousness; Brentano argued that there is really no static "content" of consciousness; experience is made up of activities, not sensations and ideas
introspection
looking inward to examine one's consciousness; name given to the practice of examining one's consciousness
phenomenology
what one takes to be the nature of one's own experience; for Titchener this was the sensation, image, and affect; for the Gestaltists it was the structured whole; also has a more technical meaning, referring to philosophies that stress immediate experience and that reject the representative theory of perception
intervening variable
a summary word or term for a lawful cause-effect relationship
behaviorism
subject matter of psychology is activity, "what people do," etc.; a view that holds that the subject matter of psychology may best be viewed as activity (behavior), rather than as "things" or structures; according to behaviorists, we do not have thoughts or images, but we do think and imagine; further, behavior is considered as significant in itself; a behaviorist does not treat what we do as the result of some underlying mechanisms, such as biological centers or repressed wishes
Herbert Spencer
known for espousing "evolutionary associationism"
Hermann von Helmholtz
proposed a theory of unconscious inference
Kenneth MacCorquodale
with Meehl, wrote a classic critique of psychological constructs
Conwy Lloyd-Morgan
proposed a "canon" advocating conservative interpretations
Paul Meehl
with MacCorquodale, wrote a classic critique of psychological constructs
George J. Romanes
analyzed animals' "minds" via introspection by analogy
Edward Titchener
headed the structuralist school of psychology
Charles Darwin
articulated the theory of evolution
Wilhelm Wundt
founded the first psychology laboratory, Leipzig 1879
James Mill
the first to show how, in principle, associationism explains "mind"
Franz Brentano
originated act psychology
William James
important 19th century Harvard philosopher and psychologist
John Dewey
pragmatist educator; opposed analysis into static elements
James Angell
spelled out main tenets of "functionalism"
assimilation
this refers to the influence of a mass of associations on one of its member elements and vice versa; for example, the word "cold" brings to mind snow, quiet, blankets, and so on; this ability of one element (cold) to call up a whole complex is an instance of assimilation; similarly, presentation of the mass, such as in a picture of a wintry scene, may influence a present thought, by making us shiver; the law of assimilation, a principle of association used by Wundt and many others, is also known as redintegration
beschreibung
name given to the method of studying conscious experience used by Wundt; carefully trained observers verbally report their "content" of consciousness when presented with simple forms of stimulation; the same method was used by many other early psychologists, although it dropped from favor after Kulpe showed its grave limitations
complication
term used by many early psychologists to refer to compound associations made of elements from different senses; a cold, wet stone is an instance of complication, including visual, tactile, and thermal elements
context theory
explanation of perception and meaning proposed by Berkeley and adopted by others - notably Titchener, in his core/context theory; percepts are reducible to a context of elements; distance is a combination of sensations from the eye muscles and the limbs; meaning is no more than the context of associated ideas connected to a core
dynamics
activities and processes (as opposed to statics, which are images, ideas, motives, and so on)
einstellung
the effect of preparation, expectancy, or "set" on the performance of mental and manual activities; was first stressed by Kulpe and the Wurzburg School
fusion
Wundt's law of association referring to compounds not analyzable to elements
psychic resultants
1 of 3 main forms of order in mental life proposed by Wundt; are those forms of association in which the sum of the elements is less than the whole; the laws of fusion, assimilation, and complication are examples
multiple response
Thorndike's principle that, at first, many responses may occur; one of Thorndike's subsidiary laws; refers to the behaviors that an individual brings to a learning situation and that determine what behaviors will occur
law of effect
Thorndike's law on how consequences affect S-R connections; Thorndike's doctrine that held that responses were connected to or disconnected from situations depending upon the effect produced by the response
discipline method
drilling on material to develop stronger mental faculty; method of education in which drill exercises are used to develop a general mental faculty; for example, the memorizing of poetry was used to develop the general faculty of memory
annoyer
Thorndike's term for effects that weaken S-R connections; Thorndike's term for a punisher; a state of affairs that stamps out the associations between a situation and a response; in Thorndike's pre-1930 theory, a response followed by this will be less likely to occur in that situation in the future
connectionism
Thorndike's term for his theory of learning; learning consists of the connecting of stimuli and responses
associative shifting
Thorndike's term for S-R learning via contiguity alone; one of Thorndike's subsidiary laws corresponding to what was later called classical conditioning
mentalism
the belief that nonphysical mind events cause physical events; the belief that mental phenomena are different in kind from physical phenomena and that mental events may cause physical events
stamping in or out
term used by Thorndike to describe the automatic connecting (or disconnecting) of S-R bonds by effects
original behavior series
Thorndike: an innate sequence of responses that "satisfies"; Thorndike's term for a sequence of activities that are determined innately and that constitute satisfaction when completed
law of exercise
Thorndike's law on how mere activity affects S-R connections; Thorndike's pre-1929 principle that referred to the connecting of a response and a situation simply because they frequently occurred together
hedonism
the view that pleasure and pain govern behavior; the ancient doctrine popularized in British empiricism, which assumes that pleasure and pain are essential determinants of conduct

a classic view that behavior is governed by pleasure and pain; the doctrine that our conduct is regulated largely by pleasure and pain
mechanism
explanations that do not appeal to supernatural forces or agents; assumption that explanations must not refer to outside agents, such as demons or life forces
selecting and connecting
only the Rs that produce satisfiers become bonded to Ss; term used by Thorndike to describe his theory; responses are selected by the fact that they produce satisfiers, which also connect them to the situation in which they occurred
satisfier
Thorndike: a state of affairs that connects the preceding S-R; a state of affairs that acts to connect a response with the situation in which it occurred and thus form an S-R bond; any more precise definition of a satisfier becomes difficult
Rene Descartes
said "I think, therefore I am"
learning sets
Harlow's term for "learning to learn"
negative law of effect
the part of Thorndike's theory about the effects of annoyers; the portion of the pre-1929 law of effect that refers to the effect of annoyers, or punishers
piecemeal activity
Thorndike: responding is only to "pieces" of the whole situation; one of Thorndike's subsidiary laws, which referred to the selective nature of perception; we react only to a small subset of the elements of a situation
response by analogy
Thorndike's principle of "transfer" of learning to new situations; Thorndike's principle of transfer; we respond to a new situation in the same way that we responded to similar situations in the past
Edward L. Thorndike
originated first Stimulus-Response (S-R) reinforcement theory
conduction unit
Thorndike's theoretical neural mechanism of Law of Effect; used by Thorndike to illustrate law of readiness; depending upon its readiness to fire, the firing or not firing of this constitutes satisfaction or annoyance
Harry Harlow
UW prof; demonstrated "learning sets" and "insight" in monkeys
Wolfgang Kohler
pioneer in Gestalt psychology; stressed role of insight in learning
Law of readiness
Thorndike's law on what enables reinforcers and punishers; also called the law of instinct; Thorndike's law refers to the conditions that determine satisfaction and annoyance; present stimuli produce a readiness for certain types of consequences
successful operation
Thorndike: the unimpeded completion of an original behavior series; term used by Thorndike to specify the conditions for satisfaction; refers to the unimpeded completion of an original behavior series
faculties
mental powers that come into play when the mind does work; mental powers that act semiautonomously; attention, memory, judgement, perception, and imagery are often proposed as these
attitudes/dispositions/preadjustments/sets
four terms treated as synonymous and referring to the effect of preparation of the learner on the effectiveness of satisfiers and annoyers, as well as on other effects of a new situation on performance
insight
term used by Kohler to describe the sudden solutions to problems he observed in his primate subjects; insight involves the apprehension of relationships in a problem situation, which Kohler contrasted with the "blind fumbling" and action of the law of effect that he saw in Thorndike's theory; readers of Kohler's accounts of insight in apes may disagree with his interpretations; like other Gestalt psychologists, Kohler downplayed the effects of experience in promoting insight; in 1959 he said that, although he had been confused on the issue in the past, he realized that only sudden solutions without prior experience represented true instances of insight; thus defined, insight may be a relatively rare event
problem box
a crude device used by Thorndike and other animal researchers to study problem solving; typically, a subject placed in such a box could escape by operating a release mechanism, such as a lever or a pull cord
aversive
category of UCSs; things harmful or not beneficial to learner; this adjective describes stimuli that we judge to be unpleasant, dangerous, or otherwise not beneficial for the organism
conditioned suppression
reduced instrumental response rate in presence of a Pavlovian CS; procedure used to investigate the inhibitory and/or aversive properties of stimuli; for example, a stimulus previously paired with shock may be presented while subjects are bar pressing or proof reading; if the rate of the latter performances is decreased during the presentation, the added stimulus is judged inhibitory
extinction below zero
reduced spontaneous recovery following greatly extended extinction
Ivan Petrovich Pavlov
first Russian Nobel laureate; originated important conditioning method
excitatory conditioning
CS evokes CR activity because of previous CS-UCS pairings; classical conditioning in which a CS reliably predicts the occurrence of a UCS; this is the opposite of inhibitory conditioning, in which the appearance of the CS means that the UCS is not coming
algebraic summation
combining excitatory and inhibitory CSs to assess CR magnitude; method used by Pavlov to demonstrate the inhibitory and excitatory properties of CSs by pairing them with other CSs; For example, an inhibitory CS introduced along with another CS will cause a decrease in responding to the latter; a positive CS presented with another positive CS should produce a response greater than that to either of the individual stimuli; this method also was used by Rescorla
Sir Charles Sherrington
English physiologist and Nobel laureate; did early studies of spinal reflexes
classical conditioning
Pavlov's training procedure; learning via pairing of stimuli; procedure whereby 2 elicitors are paired in such a way that the weaker precedes the stronger by a second or more (ideally) and so that the weaker stimulus reliably predicts the stronger; conditioning is said to occur when the weaker stimulus produces a response similar to that produced by the stronger; for example, the taste of coffee, associated with the effects of caffeine, may actually perk us up
delayed conditioning
standard Pavlovian procedure; CS onset precedes UCS onset; classical conditioning procedure in which the CS is presented, remains present, and is later followed by the UCS; with training, responding decreases during the early part of the delay interval
extinction
CS alone; disappearance of CR when CS is no longer followed by UCS; the decrease and eventual disappearance of a CR, which happens when the CS is repeatedly presented without a UCS; Pavlov believed that this brought about the inhibition of the CR
differentiation
different CRs to different CSs due to CSs paired with different UCSs; Pavlov's term for discrimination formation; it includes both the discriminating of stimuli and the associating of UCSs with appropriate stimuli
acquisition
pairing of CS with UCS; development of CR to CS; the development of an ever-stronger and more reliable conditioned response (CR) through conditioning
disinhibition
recovery of inhibited response when a novel stimulus occurs; the release from inhibition produced by a new stimulus, such as a hand clap or a trumpet blare; this is evidenced by the appearance of responding in the presence of an inhibitory CS
Robert Rescorla
prominent contemporary "association learning" researcher at U of Penn
external inhibition
disruption of CR performance by occurrence of novel event; suppression of a CR produced by introduction of a stimulus that produces a competing response; for example, conditioned salivation by a dog may be disrupted if the dog's name is suddenly called in a loud voice
concentration
Theoretical Pavlovian process; highly focused cerebral activity; process that occurs during classical conditioning, specifically during discrimination learning; early and late in training, excitation and inhibition associated with CS+ and CS- irradiate (spread); midway in training, excitation and inhibition concentrate, or remain close to their respective brain centers; is the opposite of irradiation
backward conditioning
reverse of delayed conditioning; UCS first, then CS; a conditioning procedure in which the UCS precedes the CS; this was formerly thought to lead to no conditioning, but it has been recognized as the chief means for producing inhibitory conditioning
inhibition
one of Pavlov's fundamental brain processes; reduced activity; basic neural process which Pavlov believed worked with excitation to regulate the workings of the brain; For Pavlov, inhibition meant the suppression of neural activity, directly opposing the activation produced by excitation; it is known that excitation and this are characteristics of neural activity
appetitive
category of UCSs; things that are beneficial to learner; this adjective describes stimuli that we judge to be pleasant, adaptive, or otherwise beneficial for the organism
experimental neurosis
bizarre behavior brought on by insoluble Pavlovian differentiation; bizarre behavior brought on by an insoluble problem after experience with similar problems that were soluble; Pavlov believed that disruption of normal excitation and inhibition in the brain accounted for experimental neurosis
conditioned stimulus (CS)
originally neutral stimulus that comes to evoke CR; weak elicitor that is paired with a stronger elicitor in such a way as to acquire the power to evoke the response originally evoked by the stronger elicitor (UCS)
conditioned response (CR)
the "new" response to CS in Pavlovian conditioning; response evoked by a CS after pairing in a specific way with a UCS; the CR resembles the response to the UCS, not the original response to the CS; for example, the sight of a cut onion may produce a CR secretion of tears
unconditioned reflex
natural, unlearned UCS-UCR combinations; inborn reactions to stimuli; reflex behaviors that do not depend on the conditions of our experience; unconditioned reflexes are those reactive behaviors with which we are born
second-order conditioning
a CR to a CS due to its having been paired with a previously trained CS
inhibition of delay
the conspicuous absence of CR during the early part of a long-duration CS; a decrease in responding that occurs during the early part of the delay period during delayed conditioning; Pavlov believed that this was due to the fact that time acted as a CS, and, since that time just after the onset of the CS was never accompanied by the UCS, it became inhibitory; a hand clap could restore responding during this period, showing the disinhibition of inhibition of delay
semantic generalization
generalization of a CR to different-sounding words that mean the same
instrumental conditioning
Thorndike's procedure; learner's behavior produces result or "effect"; learning that depends on the consequences of responses; the procedure is similar in some respects to classical conditioning; also known as operant learning
truly random control
a comparison group suggested by Rescorla in 1967 as the only appropriate control procedure for classical conditioning; CS and UCS occur randomly; this procedure also assumes a definition of conditioning as a contingent relation between the CS and the UCS; the CS must act as a reliable predictor; the control procedure defines the absence of classical conditioning by presenting the CS and the UCS randomly, so that occurrence of the CS may mean that the UCS is coming or that it is not; the CS predicts nothing
pseudoconditioning
a "CR-like" response to a CS, but not due to pairing of CS with UCS; apparent CRs produced because the UCS has rendered the subject overly reactive to stimuli in general; this occurs especially with strong, noxious stimuli, such as strong electric shock; thus, if the CR is a change in heart rate and the UCS is powerful shock, a variety of stimuli other than the CS may produce heart rate change, and responses that occur to the CS therefore cannot be certified as true CRs
psychosomatic illness
bodily illness due to psychological learning rather than physical causes (germ, etc.); term referring to bodily illness produced by the mind; may be as fatal as other illnesses and are almost surely due to classical conditioning, at least in large part; because the distinction between mind and body is of questionable worth, the term psychosomatic may be obsolete
spontaneous recovery
reappearance of extinguished CR to CS after a period of no further CSs; name given to the reappearance of a CR that had been extinguished earlier; since the effect occurred only after a period of time had passed since extinction, Pavlov suggested that this was evidence for the accumulation of inhibition during extinction and its dissipation
omission procedure
if CR occurs to CS, UCS is not presented; test for "Law of Effect"; experimental procedure in which a CS is presented and a UCS thereafter, unless a CR occurred to the CS; thus, the sight of water might be followed by a drink of water, as long as no salivation occurred when the sight of water appeared; omission training was first used to determine whether some CRs are affected by their consequences; if a CR occurs whether it prevents a UCS or not, then it is a true CR
semantic conditioning
Pavlovian conditioning to the "meaning" of a word as the CS
Pavlovian induction
effects produced by a CS on response to a subsequent CS; effects produced by a CS on responding during a subsequent CS; for example, positive induction occurs when a CS- precedes a CS+ and the response to the CS+ is stronger than when CS- does not precede it; negative induction occurs when responding to a CS- is suppressed by an immediately preceding CS+; Pavlov believed that induction worked together with irradiation and concentration during discrimination learning
inhibitory conditioning
CS opposes CR because CS has previously marked "no UCS" periods; classical conditioning procedure in which a CS reliably signals the absence of a UCS; this is the opposite of excitatory conditioning
simultaneous conditioning
pairing of CS and UCS by turning them on and off together; classical conditioning procedure in which the CS slightly precedes and overlaps the UCS; they occur almost simultaneously; however, the CS must precede the UCS slightly
interoceptive conditioning
Pavlovian conditioning by stimuli occurring inside learner's body; classical conditioning in which the CS, the UCS, or both are applied within the body, to an organ or a system within the body
phonetic generalization
generalization of CR to words that "rhyme" with the original CS word
sensory preconditioning
CR to CS2 due to CS2 paired with CS1 before conditioning of CR to CS1; procedure in which two stimuli are presented together and one of them is later paired with a UCS; if the other stimulus also produces a CR, this has occurred; for example, a light and a tone may be paired a number of times, followed by the pairing of the tone with electric shock; if the presentation of the light produced a shock response, then this took place
S-R association
a hypothetical construct positing a stimulus connected to a response
S-S association
a hypothetical construct positing a stimulus connected to another stimulus
suppression ratio
B/(A+B); used to quantify the extent of "conditioned suppression" to CS
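A minimal computational sketch of this ratio, assuming the common convention (not stated on the card) that B is the response count during the CS and A is the count during an equal-length period just before it; a value near 0.5 indicates no suppression, while values near 0 indicate strong conditioned suppression.
```python
def suppression_ratio(pre_cs_responses: int, cs_responses: int) -> float:
    """Compute B / (A + B), where A = responses in the pre-CS period
    and B = responses during the CS (assumed convention)."""
    a, b = pre_cs_responses, cs_responses
    if a + b == 0:
        raise ValueError("no responses recorded in either period")
    return b / (a + b)

# Example: 20 lever presses before the CS, 5 during the CS
print(suppression_ratio(20, 5))  # 0.2 -> substantial conditioned suppression
```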
trace conditioning
a conditioning method in which CS precedes UCS, but goes off before UCS; classical conditioning procedure in which a CS is presented, taken away, and later followed by the UCS; CRs that appear are assumed to be caused by a connection between a memory trace of the CS and the UCS
sensitization
a general state of heightened responsivity to all stimuli; CRs that depend upon factors other than the specific pairings of CS and UCS; for example, a CS light could evoke eye blink responses independent of the UCS air puff; additionally, the occasional presentation of air puffs could make such blinks more likely
irradiation
theoretical Pavlovian process; spread of local activity across brain; spread of excitation and inhibition, which Pavlov supposed to occur on the surface of the cortex; during the formation of a differentiation, this occurs early and late in training
alimentary center
brain center aroused when eating occurs, in Pavlov's theory of conditioning; this was strongly activated by food in the mouth (a UCS), and part of this activation spread to the brain centers corresponding to CSs simultaneously present
equalization phase
stage described by Pavlov that occurs during prolonged extinction training; during this phase, both strong and weak CSs produce the same magnitude CR
eyelid (eye-blink) conditioning
classical conditioning method popular in the US; a UCS air puff to the cornea produces a UCR eye blink; a CS (such as a tone) preceding the air puff produces a CR eye blink; it is less messy than conditioned salivation, and it has been extended to the blink of the nictitating membrane in rabbits
final common path
Sherrington's term for the motor path in the spinal reflex system; there are a great many more afferent (sensory) nerves than there are motor outlets, and Sherrington described a competition among sensory nerves for the motor outlet, which is what this term names
integrative action
Sherrington's term for the coordinated activities of individual organs, with the spinal cord acting as an organ of integration; the term stresses the action of the system as a whole and opposes the analysis into discrete units (for example, reflexes)
neural unit
basic element of sensory experience first proposed by Mach in the last century and adopted by a host of workers in sensory physiology during this century; consists of an excitatory center surrounded by a zone of inhibition
paradoxical phase
stage occurring during prolonged extinction in which weak CSs produce stronger responses than do strong CSs
reciprocal inhibition
term used by Sherrington to refer to the inhibition of an antagonist muscle by a contracting muscle; when the biceps contract, the triceps are inhibited
reflex
a reaction produced by specific stimulation; one example of an innate reflex is the constriction of the pupil produced by light in the eye

for Skinner refers to order; the discovery of this is the discovery of an orderly relationship between the world and behavior; Skinner extended a masterly analysis of the concept of this to conditioned reflexes and operants in the 1930s
Sherrington
English physiologist who won the Nobel Prize in 1932 for his earlier analysis of the spinal nervous system in 1906; is credited with firmly establishing the reflex as the basic unit of physiology, but he chose to emphasize the workings of the nervous system as a whole and called the reflex "a convenient fiction"
simultaneous induction
name given by Sherrington to the fact that the stimulation of a sensory surface may produce a decrease in threshold in adjacent units which would lead to the same response; it was this usage which Skinner later meant when he referred to induction as generalization
successive induction
term used by Sherrington to describe the increased responsiveness of a muscle group following release from inhibition; for example, if we flex our biceps strongly for a time and then relax it, we may find our arm straightening more quickly than we expected, due to the strong contraction of the triceps; this is the sense in which Pavlov used "induction," that is, as a successive effect of "opposite sign"; thus, excitation produces subsequent inhibition and inhibition leads to later excitation
ultraparadoxical phase
the stage of prolonged extinction in which positive CSs produce no reactions and negative CSs do produce CRs
frequency
a repeated occurrence; a factor that Watson believed determined behavior
instinct
species-specific behavior that is present at birth or upon maturation; older views of instinct treated it as independent of learning, and much research was done to separate instinctive from learned behavior; usually was not applied to obvious reflexive behavior, such as occurs in the process of digestion or in the blinking of the eye when a speck of dust strikes it; depending upon the theorist, this could refer to the nest building of birds and the web spinning of spiders or to the performance of children on so-called intelligence tests and the customs of ethnic groups; current opinion holds that a crass division of behavior into instinctive and learned is unwise; in Watson's time it was common to invoke this as the explanation for differences in the behavior of groups of individuals, whether human or animal
visceral habits
Watson's terms for learned responses constituting "emotions"; Watson's term for learned behaviors involving the viscera, or internal organs; for example, changes in heart rate, perspiration, and gastric activity may occur when we are frightened by a ghost story; this emotional reaction is due to prior learning and thus represents this; our reaction to the pledge of allegiance, the phrases "up against the wall" and "good job" likewise include visceral components; visceral, or emotional reactions are part and parcel of most of our reactions, along with the manual and laryngeal components
Rosalie Rayner
J.B. Watson's spouse and early research collaborator
laryngeal habits
Watson's term for learned behaviors that serve a language function; Watson's term for learned behavior that constitutes the use of language; this includes not only spoken words, but any behavior that acts as a substitute for action or as a symbol for an object; have a communicative function and include, for example, a shoulder shrug
kinesthesia
sensory effects produced by muscular movements; when a muscle contracts, muscle spindles produce neural impulses that aid in the coordination of muscular movement; other feedback occurs when a limb is moved, due to the action of Golgi tendon organs; without information from muscle spindles and Golgi tendon organs, coordinated movement is difficult
emotional flooding
method of psychotherapy that presents a feared circumstance "close up"; method of psychotherapy in which a feared object is repeatedly presented at close range to the patient, who may react quite violently; such treatment has been reported to sometimes alleviate unreasonable fears (phobias)
behaviorism
view popularized by J.B. Watson; emphasis on behavior in psychology; the view that the subject matter of psychology is the study of behavior, which usually includes private behavior, such as seeing and dreaming; behaviorists do not use explanations that refer to hypothetical inner mechanisms, such as mental powers, or supposed neural events
recency
the last response to a situation; considered an important factor by Watson; the influence on behavior produced by the most recent experience in a situation; for example, the repetition of an error made on the preceding trial could be viewed as a recency effect
Mary Cover Jones
did earliest work (with Watson) on treating learned fears in children
Zing Yang Kuo
strongly criticized the concept of "instinct"; suggested learning as alternative
manual habits
Watson's term for learned skeletal muscle behaviors, usually overt (observable); threading a needle is an example
Edwin Holt
noted that knowing "how" something works makes "why" less important
functionalism
a "school" of psychology that minimizes structure and emphasizes function; movement in psychology that opposed the emphasis on the analysis of the structure of consciousness; functionalists stressed the importance of activity, especially adaptive activity; Watson opposed this movement because, despite its emphasis on behavior, it still stressed the importance of consciousness, which he felt was counterproductive
John Broadus Watson
renegade genius of pre-WWI era who coined and promoted "behaviorism"
generic mentalist view
Consciousness is: mental, central, profound, primary, the cause, fascinating, monumental

behavior is: behavioral, peripheral, superficial, secondary, its effect, mundane, incidental
generic behaviorist view
behavior is: what organisms do, overt or covert, public or private, verbal or nonverbal, a source of stimuli, often functional, often nonfunctional

consciousness is: a form of behavior, mostly covert, mostly private, mostly verbal, a source of stimuli, often functional, often nonfunctional
stereotypy
precise repetition of the exact movements in a behavior; for Guthrie, the repetition of a movement or a series of movements in precisely the form that occurred when the individual was last in the same situation; much of our comings and goings, our thoughts and our moods, occur in a stereotyped manner; is what Guthrie and Horton found in their famous experiments with cats in the puzzle box and it is the only major principle of Guthrie's theory
toleration
gradually fading in bad-habit cues while the learner is behaving well; one of Guthrie's major methods for changing habits; cues that ordinarily produce unwanted reactions are presented in graded steps while another activity is occurring; if done properly, the unwanted reaction never occurs, since it is replaced by the competing behavior; for example, we may use the method of toleration to train a shy speaker to speak easily before large groups, by first exposing the speaker to an imaginary group, while relaxed, and then introducing an audience one by one; if we maintain the speaker's relaxation through this process, we may finish with him or her facing a large audience with no qualms; graded presentation of feared stimuli in this fashion is typical of Wolpe's desensitization procedure
maintaining stimuli
Guthrie's concept for motivation or drive; Guthrie's term for stimuli that lead us to action which persists until these disturbers are removed; others called such stimuli drives, referring to hunger, thirst, and so on; but Guthrie included other motivating stimuli, such as those produced when we are asked a question that must be answered
stimulus sampling theory
Estes' mathematical model of learning based on Guthrie's theory
exhaustion
one of Guthrie's methods for eliminating habits; like "flooding"; one of Guthrie's methods for changing habits; the behavior to be eliminated is repeatedly evoked in the presence of the cues that usually accompany it; eventually, the behavior is "exhausted" and whatever behavior then occurs replaces it; for example, an individual with a strong fear of automobile travel might continuously ride in a car until the fearful reactions subside, or a smoker might smoke cigarette after cigarette until the sight and smell of cigarettes produces a new reaction; emotional flooding is the name currently given to this method; here, it is ordinarily a phobic stimulus, such as a spider, or the description of other emotion-producing situations that is repeatedly presented until the strong reaction usually produced fades
motive
Guthrie: a set of maintaining stimuli that instigate behavioral action; for Guthrie, an instigator to action; hunger, a question, a shiny object, and a challenge are all examples of this; may be viewed as a set of maintaining stimuli, which produce action until removed
sidetracking
Guthrie: changing a habit by altering its terminal components; a method of changing habits, which consists of beginning the movements that constitute the habit, followed by movements incompatible with the habit; one may counter a cookie-eating habit by raising a cookie to the mouth and then throwing it away; this is not a very good method, since the habit to be broken remains intact, with only the initial movements altered; Guthrie stressed his other methods, which amount to dismantling the habit
overcorrection
changing a habit by repeatedly practicing the correct action in situ; changing a bad habit by repeatedly practicing the correct behavior in the situation in which we wish it to occur; for example, when a mistake is made, we stop and repeatedly perform the correct behavior
incompatible stimuli
one of Guthrie's methods for eliminating habits; "counter-conditioning"; Guthrie's method of changing behavior by introducing stimuli that produce a reaction incompatible with the reaction to another set of stimuli; for example, one might treat cigarette smoking by encouraging the patient to eat apples when the urge to smoke arises; it is difficult, though not impossible, to simultaneously eat apples and smoke; in another example, strong anxiety reactions to feared situations could be countered by muscle relaxation training; in both cases, the aim is to somehow produce a desired reaction (nonsmoking or relaxation) in the presence of cues that usually lead to the undesired reaction (smoking or anxiety)
W.K. Estes
U of M alumnus who mathematicized Guthrie's theory
Edwin R. Guthrie
originated the all-or-none S-R association-through-contiguity theory
contiguity
nearness in time or space; in learning, events going together in time; nearness; spatial and temporal contiguity refer to closeness of events in space and time, respectively; Guthrie's theory emphasizes the importance of temporal contiguity of stimuli and responses

a pairing in time between two stimuli or between a behavior and an effect; closeness in space and/or time; Pavlovian conditioning was formerly thought to depend only on the simple contiguity of CS and UCS in time
negative practice
deliberately practicing a bad habit to discover what cues produce it; conquering a bad habit by repeatedly performing it in the presence of the cues that normally produce it; for example, we might practice a bad tennis backhand so that we notice the cues that produce it; then we can practice the correct movements in the presence of those cues
redintegration
calling up an associative complex by one or more constituent elements; the calling up of a stimulus complex by one or more members of the complex; for example, the smell of roast beef may bring to mind the sight and taste of the roast beef present when last we experienced the smell; this principle is really the essence of Guthrie's theory, and he proposed that redintegration was a better term to describe his theory than was conditioning
behavior therapy
method of dealing with psychological problems by treating them essentially as behavior problems; for example, anxiety may be a serious problem for an individual; a behavior therapist would seek the conditions that now produce anxiety, such as specific situations or relations with other people; therapy would aim to eliminate the anxiety response to whatever now produces it, perhaps through desensitization; other therapies, especially the older ones such as psychoanalysis, are less concerned with the current causes of problems and treat afflictions such as anxiety as symptoms of some underlying psychic disturbance; they attempt to treat this "underlying cause," a process that can take years and is often unsuccessful
British empiricism
school of British philosophy represented by Locke, Berkeley, Hume, and later by James Mill and his son John Stuart Mill; most British empiricists stressed the importance of temporal contiguity in the association of sensations and ideas
reward
for Guthrie, a special case of association by contiguity; is simply a change in stimuli that acts to preserve the association between a set of stimuli and whatever actions last occurred in the presence of those stimuli; if the change produces strong new reactions that are incompatible with the last actions in the presence of the old cues, the effect is punishment
species-specific defense reactions
name given by Robert Bolles (1970) to the reactions naturally called out by danger; Bolles suggested that avoidance learning is aided when the response is one of the reactions that normally occur when a member of the species in question is threatened; thus, it is easy to train a rat to avoid shock when the avoidance response is running (an SSDR), but it is difficult when the rat must press a lever to avoid shock; lever pressing, which involves manipulation with the forelimbs, normally occurs in eating, not in the reaction to danger
Clark L. Hull
Yale professor who applied hypothetico-deductive method to learning theory
dependent variable
the objective measurement of the behavioral effect of a manipulation; that which we observe or feel as a change in something as the result of some manipulation; the data of most experiments are this kind of variable; they are the effects that we wish to predict; for example, a period without food produces changes in these, such as in reported hunger and in the amount of food eaten once it is made available
drive
Hull's term for all the factors that motivate or energize behavior; D; Hull believed that learning per se is not sufficient to produce action; drive is necessary to energize learning; "a demand for or against a given type of goal object or situation" plus a vague readiness as to the proper way to get to that goal object

the chief way of treating motivation in many psychological theories, especially Hull's; biological imbalances, or needs, lead to behavior aimed at reducing the need; a need acting to produce behavior is termed to be this
David McClelland
psychologist best known for his work assessing nAch (achievement need)
aversive drive
stimulation that leads to activity to escape or avoid it; from Hull; this commonly refers to electric shock and other things that produce pain; hunger, which may be painful, is referred to as an appetitive drive, since the object that reduces the drive is viewed as pleasure producing, rather than as pain ending; Hull and his followers used the appetitive/aversive distinction, but it is surely vague and has been criticized widely; for example, when we do something to escape the cold, are we driven by the aversive drive of cold, or the appetitive drive for warmth?
fractional antedating goal response
a partial goal response conditioned to stimuli that precede reinforcement; also called the fractional anticipatory goal response; this is a partial goal response that may occur well in advance of the meeting with the goal stimulus; when someone is anxious to eat, fight, dance, or whatever, he or she is already making some of the responses that will occur when the goal activity begins; for Hull, the way it feels to make such anticipatory responses is the physical basis for what we call anticipation; symbolized as rG - sG
frustration
an emotional reaction elicited when the anticipated reward does not occur; Spence's term for the reaction produced when no reward is received in situations in which it has been anticipated; Spence felt that this was the basis for inhibition; Amsel, Wagner, and others developed a theory in which fractional anticipatory frustration comes to act as an added motivator
automaton
a "behaving machine" whose parts and processes are totally understood; a machine, or automatic device, built from parts that are known and the actions of which are understood; Hull suggested that we treat ourselves and animals as automata in order to avoid explanations for our behavior that rely on agents whose properties we do not understand; thus, purpose and knowledge are understood only when we have built a machine that shows purpose and knowledge
Neal Miller
Hull's student; authored "Social Learning & Imitation" (1941) with Dollard
conditioned inhibition
Hull's notion in which a learner learns the "habit" of not responding; a negative habit, or tendency not to do whatever response was last done in the presence of the stimuli involved; like positive habits (sHr), this is reinforced by drive reduction; in this case, the drive reduced is reactive inhibition (Ir), which begins to dissipate when action slows or ceases
chain
a specific sequence of responses tied together by their consequences; a sequence of behavior that is linked together (as a chain) by the consequences of each response; these consequences may be in the form of movement-produced stimuli or stimuli from outside, as when we recite a poem and each word or line acts as a stimulus for the next response
Albert Bandura
influenced by Spence at Iowa, now known for his Social Learning Theory
generalization
the transfer of learned behavior to new situations similar to an old situation

stimulus generalization refers to the fact that we respond to a range of stimuli that are similar to stimuli to which we have learned to respond; for example, we may behave in the same way to policemen, even when they are dressed in uniforms unlike those to which we are accustomed

response generalization refers to a behavior learned in a specific way that may occur in similar, but not identical form; once we learn to button a sweater, for example, we may do so in somewhat different ways on every occasion that follows; Hull treated this as a case of generalization
Gordon Allport
Harvard personality theorist who emphasized functional autonomy
drive stimulus
Hull's conceptualization of biological drives in stimulus form (stomach contractions, and so on); if drives can be treated as stimuli, then they can enter into connections with responses, as do other stimuli
Kenneth W. Spence
Hull's prominent student and collaborator; long at U of Iowa
functional autonomy
the process by which motives originally dependent on basic needs become independent of them
excitatory reaction potential
Hull's intervening variable representing the tendency to behave; E = H x D; symbolized by sEr; it is the product of sHr times drive, minus inhibitory factors; reaction potential may be measured in terms of latency of response, amplitude of response, and resistance to extinction
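A small illustrative sketch of the relation this entry states (and the postulate list below spells out): reaction potential as habit times drive minus the inhibitory terms, with responding predicted only when the result exceeds the reaction threshold. The function names and numerical values here are hypothetical, chosen only to echo the Perin/Williams point that learning and motivation multiply.
```python
def reaction_potential(habit_sHr: float, drive_D: float,
                       reactive_Ir: float = 0.0, conditioned_sIr: float = 0.0) -> float:
    """Net sEr = (sHr x D) - (Ir + sIr), per the entry above (illustrative only)."""
    return habit_sHr * drive_D - (reactive_Ir + conditioned_sIr)

def will_respond(sEr: float, threshold_sLr: float) -> bool:
    """A response is predicted only if sEr exceeds the reaction threshold sLr."""
    return sEr > threshold_sLr

# Hypothetical values: a strong habit with low drive yields weak performance,
# because learning and motivation multiply rather than add.
sEr = reaction_potential(habit_sHr=0.9, drive_D=0.2)
print(round(sEr, 2))                      # 0.18
print(will_respond(sEr, threshold_sLr=0.25))  # False
```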
intervening variable
term denoting a lawful relationship between a set of empirical variables; term used to link the effects of several independent variables on several dependent variables; for example, suppose you are deprived of food (independent variable); the more you are deprived of food, the more you eat (dependent variable); you may attribute your behavior to hunger, an intervening variable
hypothetico-deductive method
Hull's approach to learning theory and research; a method for discovering new knowledge, which begins by clearly stating initial assumptions and then deduces reasonable theorems and corollaries that can be tested in some way; depending upon the outcome of tests, the derived principles are modified and, if necessary, the postulate is altered; proponents of this method feel that it provides an organized framework for conducting research; for example, a postulate could state that all stimuli present when a response is made and a drive is then reduced become capable in the future of evoking that response; derived theorems could say that (1) all stimuli, whether noticed or not, become capable of evoking the response, (2) some response must be made in order for the stimuli to become attached to it, and (3) drive reduction is essential for stimuli to become connected to responses; Hull was the only psychologist to explicitly use this method, although it is implicitly used by many scientists in all fields
postulate set
a set of basic assumptions from which theorems can be deduced; set of basic assumptions from which are drawn the theorems to be tested empirically; reflects what the author believes to be fairly well accepted principles concerning the phenomena under study

Hull's 1934 list of postulates was as follows:
1. the afferent trace
2. afferent neural interaction
3. unlearned S-R connections
4. learned habits (sHr)
5. stimulus generalization (s-Hr)
6. drive (D) and drive stimulus (Sd)
7. excitatory reaction potential (sEr)
8. reactive inhibition (Ir)
9. conditioned inhibition (sIr)
10. behavioral oscillation (sOr)
11. the reaction threshold (sLr)
12. probability of response (sEr - sLr)
13. latency (str) of a response as a function of sEr
14. resistance to extinction as a function of sEr
15. amplitude of response as a function of sEr
16. competing response: the response with the greater sEr prevails
reminiscence
a significant improvement in learned performance following a rest; improvement without practice; for Hull, this was the result of the fading of reactive inhibition; for example, when subjects are memorizing long lists of words, periods of no practice often improve performance, compared with the performance of other groups, in which the subjects continued practice without rests
postulate
a carefully stated assumption that guides thinking and theoretical research
S-R psychology
any theory attributing behavior to learned stimulus-response connections; psychological theories that hold that our behavior and experience may best be conceptualized as responses attached to stimuli in the world and in us; such theories often advocate the analysis of behavior and experience into S-R associations; the original S-R theorist was Thorndike, followed by Watson, Guthrie, and many others; the most extreme S-R view was that of Hull and his many followers; though widely criticized, this view has proven very durable and useful over the years, and a great many current psychologists believe that it is worth retaining
peak shift
in a stimulus generalization test, most responding occurs at a point displaced from S+ in the direction away from S-
Perin/Williams experiments
classic studies by Hull's students supporting the claim that E = H x D; classic experiments done in the early 1940s that were used by Hull as evidence for the assumption that learning and motivation are two separate things and that learning and motivation multiply to produce performance
secondary reinforcement
reinforcement by an event that does not reduce a primary/biological drive; such things as receiving praise or acquiring money are often called secondary reinforcers
incentive motivation
the Hull/Spence intervening variable representing the size of a reward; refers to the effects of the amount and quality of the reinforcer; symbolized K
molar
Tolman (notably): a large, functional unit of stimulation and/or behavior; larger or encompassing more; in psychology, the molar/molecular distinction concerns whether a theory deals with larger or smaller units of stimulation and behavior; theory would refer to larger units; for example, molar theorists stress large units of behavior and experience extending over time, but organized according to the relevant goal involved; "going to the store" is a molar description of behavior, where the individual movements would constitute a molecular description
molecular
small constituent units of stimuli and behavior, e.g., specific movements; in psychology, the molar/molecular distinction concerns whether a theory deals with larger or smaller units of stimulation and behavior; a molecular theory may interpret experience in terms of discrete ideas or muscle movements; "going to the store" is a molar description of behavior, where the individual movements would constitute a molecular description
reactive inhibition
Hull's term for the tendency "not to respond" produced by responding; according to Hull's 8th postulate, muscular or mental activity during learning produces an ever-increasing aversive drive (Ir); when activity ceases, this drive is reduced and a tendency not to repeat such activity results; this reduction acts as the reinforcer that produces conditioned inhibition
matched-dependent behavior
Miller and Dollard's term for "imitation"; they emphasized the social importance of imitating models who are older, of higher status, or more technically sophisticated
goal stimulus
Hull's term for reinforcing stimulus; elicits the "goal response"; symbolized by SG; such stimuli were assumed to be capable of reducing primary or secondary drives
resistance to extinction
the persistence of formerly reinforced behavior in the face of no reinforcement; the number of responses made by a subject after reinforcement for responding is discontinued; this was used by Hull and by many other theorists as a measure of the degree of learning produced by a training procedure
habit
Hull's intervening variable representing "what is learned"/the effects of learning; symbolized as H or sHr; strength depends upon reinforced occurrences of responses in the presence of stimuli; each reinforcement added an increment of habit strength that was a function of the maximum habit strength possible (M), minus the amount of habit strength already existing
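A minimal sketch of the growth rule just described, assuming an arbitrary growth fraction f and maximum M (not Hull's fitted constants): each reinforced trial adds a fraction of (M - sHr), so the increments shrink as habit strength approaches its ceiling.

# Sketch of Hull's habit-strength growth: each reinforcement adds a
# fraction of the habit strength not yet acquired, (M - sHr).
# f = 0.2 and M = 100 are arbitrary values chosen only for illustration.
M, f = 100.0, 0.2
habit = 0.0
for trial in range(1, 11):
    increment = f * (M - habit)   # increments get smaller as habit approaches M
    habit += increment
    print(f"trial {trial:2d}: increment = {increment:5.2f}, sHr = {habit:6.2f}")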
transposition
responding to relationships between stimuli rather than to absolute values of stimuli; for example, if I learn to choose a larger circle over a smaller circle and am then presented with the larger and one a bit larger than it, I show this if I choose the larger of the two; my behavior is controlled by the relationship of size (larger), rather than by the definite size of the specific circle; the Gestaltists stressed this, which posed quite a problem for Hull until Spence ingeniously proposed his theory of generalization and discrimination learning
independent variable
the objective measure of the "causal agent" in an experiment; normally refers to the manipulation done by the experimenter or to some other causal event in a cause-effect sequence; if an experimenter deprives you of food or water, he or she is manipulating these variables that will produce a change in your eating and drinking behavior; the changes in your behavior are recorded as dependent variables
crucial experiment
an experiment that yields results that clearly and unambiguously show that a hypothesis is true or false; there are many such experiments in the physical sciences, such as the Michelson-Morley experiment, which convinced most observers that the theory that held that space is filled with ether was definitely untrue; Hull's hypothetico-deductive approach assumed that such experiments could be done more easily than is the case; even in the physical sciences, such experiments appear relatively infrequently
immanent determinants
Tolman: the emergent, functional properties characteristic of molar acts: "a functionally defined variable (purposive or cognitive) which is inferred as immanent or 'lying' in a behavior-act"; what Tolman refers to here are the unique characteristics of molar (goal-directed) behavior: the chief examples are purposes and cognitions
Edward C. Tolman
Berkeley learning theorist influenced by Gestaltists; "Purposive Behavior"
expectancy
Tolman: the cognition that, given a sign, a "significate" will follow; the cognition that, given a present sign, a second event will follow or that, given a present sign, if I do such and such a second event will follow; a means-ends readiness is an expectancy, as is a hypothesis and a sign-Gestalt expectation
sign
events that set the occasion for acts that produce specific results; 1 of the 3 parts of the sign-gestalt, along with the means-ends readiness and the signified (goal) object; a colored panel that signals impending food is a sign
latent learning
learning without reward that remains "dormant" until a reward is available; rats wandering through a maze still learn something about it, even when there is no reward in the goal box; when the goal box is then baited, the learning previously gained is shown in more rapid learning to reach the goal box than is true for inexperienced rats
molecular behavior
a specific elemental movement or muscle action; e.g., a twitch or reflex; "a conception of behavior which stresses its underlying physical and physiological character"; these views tend to emphasize natural elementary units of behavior, such as the reflex, conditioned reflex, or habit
molar behavior
an act or set of acts defined by the goal or consequence attained; "any organic activity the occurrence of which can be characterized as docile relative to its consequences"; to be docile, a behavior must be teachable, or improvable as a function of its consequences; this amounts to goal-directed behavior
insight
the sudden solution of a problem via the natural emergence of a "Good Gestalt"; the sudden solution of a problem in the apparent absence of much prior practice with very similar problems
place learning
Tolman: learning the location of specific objects in space; this is the chief form of learning in a theory that posits cognitive maps
purpose
Tolman: the "getting to" or "getting away from" property of behavior; "a demand to get to or from a given type of goal-object"; such behavior shows persistence, as in continued trial and error, and docility, or improvement with practice
equivalence beliefs
treating a subgoal the same as a goal object; Tolman's secondary reinforcement; thus, I may treat a restaurant sign as equivalent (in a sense) to the food that I have found inside it
demand
Tolman's term for the motivational state energizing purposeful behavior; "an innate or acquired urge to get to or from some given instance or type of environmental presence or of physiological quiescence or disturbance"; Tolman treats this as synonymous with purpose
spatial memory
the capacity to appropriately return or not return to goal locations
figure/ground
the Gestalt principle concerning a basic, automatic perceptual process; basic Gestalt principle that says that the elementary unit of experience is in the form of figures (objects) on backgrounds; this is also the way that the Gestaltists treated attention
response equivalence
the substitutability of different responses that produce the same outcome; the successful performing of learned behavior even when the specific actions required to carry out the act are greatly altered; a rat which had learned to run a maze may as skillfully swim to the goal box when the maze is flooded
Gestalt psychology
a 20th century "school" emphasizing innate "organizational processes"; school of psychology that stresses innate organizing powers in perception, learning, and memory; it should not be confused with Gestalt therapy
means-ends readiness
Tolman: a cognition about how to obtain a particular goal object; abbreviated to MER; "it is equivalent to a judgement that commerce-with such and such a type of means object should lead on by such and such direction-distance relations to some instance of the given demanded type of goal-object"
cognitive map
a mental representation of the spatial relationships in one's surroundings; an individual's surroundings, including features both near and distant, as represented within him or her (that is, centrally); the ability to take a shortcut is often cited as evidence for cognitive maps in animals
cognition
a piece of knowledge about the world that determines outward behavior; we say that this is present in behavior when the effect of a specific means-ends readiness is evident; thus, a rat may repeatedly run to a specific goal box in which it finds food; if the environment is altered so that food is no longer present, the behavior will be disrupted and learning will occur; in general, cognition refers to the appreciation of the location of significant objects and the reflection of this in behavior
vicarious trial and error
Tolman: behavioral vacillation at a "choice point" prior to choosing
behavior adjustment
"the non-overtly observable surrogate for an actual running-back-and-forth ... The important thing about them is that, whatever they are, sub-vocal speech, minimal gestures or what not, they achieve the same 'sampling' of alternatives or succedents which actual runnings-back-and-forth in front of such alternatives or succedents would have achieved"; thus, these occur when a choice is to be made; for example, vicarious trial and error occurs preceding the actual overt choice; the subject is running its cognitive map
cathexes
a cathexis is the learned association of a demand (drive) with an external object; we learn that external objects such as food items or aspirin reduce demands produced by hunger or pain
consciousness
for Tolman, the "process of running-back-and-forth in front of environmental objects, placed as alternatives or succedents"; thus, this is essentially a form of behavior adjustment
docile
the characteristic of molar behavior that is better termed "modifiability" or "teachability"; if a given behavior does not get the organism to a demanded goal object or if it involves a long route, the behavior will change in such a way as to get to the goal object or to use a shorter route to get there
field cognition modes
another term for the moods of the sign-Gestalt expectation - that is, perception, mnemonization, and inference
fixation
a rigid (nondocile) connection between a sign and a goal object; a rat may persist in a position habit, such as always choosing the left alley in a maze, even though it is never the correct choice; similarly, human sexual perversions may be viewed as the attachment of the sex drive to inappropriate signs
Gestalt(en)
literally means form or configuration in German; the Gestalt psychologists believe that innate laws of Pragnanz organize the world into Gestalten
higher order drive
"certain secondary demands and sign-Gestalt readiness" (e.g., gregariousness, curiosity, imitation, self-assertion, self-abasement, etc.); such drives were presumed to be largely learned and dependent upon primary drives
hypothesis
a provisional expectancy, which may become a sign-Gestalt expectation if it regularly produces goal-objects
inference
one of the moods of the sign-Gestalt expectation: "in inference commerce with the sign object only has ever occurred before": for example, after experience in a complex maze, a rat may be able to circumvent a block placed placed in its customary path by using a roundabout path; a familiar sign, such as the sight of the entrance to the blocked path, had previously been the signal to enter the path; in this case, it acts as a signal for a new behavior, the taking as evidence for the place learning were used by Tolman as evidence for inferential behavior
means-ends capacities
"the innate (and acquired) capacities whereby a given organism or species is capable of having commerce-with and expecting means-end-relations"; the latter refers to the sensitivity of the organism to direction, distance, similarity, multiple trackness, and so on
mnemonization
1 of 3 moods of the sign-Gestalt expectation; in this case, a sign is present, but the goal object is not
motor patterns
a principle of learning added to Tolman's theory in 1949; this consists of Guthrie's theory of contiguity learning as an account for the way in which cognitions are translated to action
perception
1 of 3 moods of the sign-Gestalt expectation; in this case, all relevant stimuli (sign, goal object, and so on) are present
primacy
a perhaps enduring effect produced by the first exposure to a new situation; for example, one's first experiences with a maze (or a bicycle or a spider) may have lasting effects even though more recent experiences were quite different
regression
the often pathological return to earlier modes of behavior; a middle-aged person may begin to think and act as he or she did as an adolescent or even as an infant
signified sign-Gestalt expectation
a sign-Gestalt expectation that consists of sign, means-end readiness, and goal object, when a demand for the goal object is present and the goal object is thus signified
spatial orientation
another term for place learning
Keller & Marian Breland
early students of Skinner who pioneered the field of practical animal training
concurrent schedule
two or more mutually-exclusive behavior options that are available simultaneously; schedule of reinforcement in which two or more independent schedules are simultaneously in effect, with a lever or key corresponding to each; for example, two different VI schedules may be available, with two response keys present, one for each schedule; a subject may get into the habit of switching keys, since the longer the time spent on one key, the more likely reinforcement becomes for responses on the second key; switches between keys therefore may be reinforced; to prevent this, a changeover delay (COD) is usually used with this type of schedule; a COD prevents the receipt of reinforcement for responding on a key for some fixed time (a few seconds) after a switch from the other key to that key
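A rough sketch of the changeover-delay rule described above, using an invented stream of key presses; the only point illustrated is that no response can be reinforced until COD seconds have elapsed since the last switch between keys.

# Sketch of a changeover delay (COD) on a two-key concurrent schedule.
# The response times and the 2-second COD value are invented for illustration.
COD = 2.0   # seconds after a switch during which no response can be reinforced
responses = [(0.0, "left"), (1.0, "left"), (1.5, "right"),
             (2.0, "right"), (4.0, "right"), (4.5, "left")]

last_key, last_switch = None, 0.0
for t, key in responses:
    if key != last_key:                 # this response is a changeover
        last_switch, last_key = t, key
    eligible = (t - last_switch) >= COD
    print(f"t={t:4.1f}s  key={key:5s}  reinforceable={eligible}")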
fixed-ratio (FR) schedule
a reinforcer is given after a constant number of responses has been made; schedule of reinforcement that requires that a set number of responses occur to produce each reinforcement
instinctive drift
the intrusion of species-specific behaviors into learned response performance; name given by the Brelands (1961) to the disruptive effect of species-specific (instinctive) consummatory behavior on the learned (operant) performances of their animal subjects
discriminative stimulus
a stimulus that controls behavior due to correlation with reinforcement; operants usually are reinforced only in the presence of some class of stimuli, the SD; Skinner believes that such stimuli come to "set the occasion" for reinforcement, thus acting as discriminative stimuli; he also believes that stimuli that elicit reflex behavior or conditioned reflex behavior operate differently, by eliciting the responses which they control
B.F. Skinner
prominent Harvard behaviorist; articulated "radical behaviorism"
methodological behaviorism
use behavioral technology but pursue mentalistic theories; name given by Skinner (1945) to the view that holds that we can deal objectively only with observable behavior and that mind exists, but cannot be meaningfully studied; Skinner opposed this view and labeled his contrary position radical behaviorism
negative reinforcement
response strengthened by contingent removal or the avoidance of something; an increase in the frequency of a response that produces the offset of something; like negative punishment, this involves the termination of something; like positive reinforcement, it leads to an increase in responding; when we pull a window shade to stop the sun from shining in our eyes, the negative reinforcement that results makes it more probable that we will repeat that act the next time the sunlight annoys us
conditioned reinforcer
a stimulus that reinforces due to its relationship to other reinforcers; according to some usages, such as that of Hull, all reinforcers that do not reduce drives arising from bodily needs are assumed to be these; their power derives from an association with primary reinforcers; Skinner's view defines these as things that act as reinforcers because of some pairing with already-effective reinforcers; the latter may act as such for a variety of reasons, and the difference between these and other reinforcers lies only in the fact that these act as such only after association with effective reinforcers

an acquired reinforcer; one that gains its reinforcing power during an individual's lifetime
matching law
the fact that the relative response rate matches the relative reinforcement rate; also called the molar law of effect; the matching law holds that relative response rates match relative reinforcement frequencies

the relative rate of responding corresponds to the relative rate of reinforcement; also called the molar law of effect; Herrnstein demonstrated that relative response rate is equal to (or matches) relative reinforcement frequency
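A toy check of the matching relation with invented session totals: the proportion of responses on one alternative should equal the proportion of reinforcers earned there, B1/(B1+B2) = R1/(R1+R2).

# Toy illustration of Herrnstein's matching law; the counts are invented.
B1, B2 = 1500, 500       # responses on alternative 1 and alternative 2
R1, R2 = 60, 20          # reinforcers earned on alternative 1 and alternative 2

relative_responding = B1 / (B1 + B2)
relative_reinforcement = R1 / (R1 + R2)
print(f"relative response rate      = {relative_responding:.2f}")
print(f"relative reinforcement rate = {relative_reinforcement:.2f}")
# Both come out to 0.75: responding "matches" reinforcement in this example.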
empirical law of effect
the fact that current behavior is a function of its past consequences; Skinner's version of the law of effect, which holds that reinforcers need not share any properties except their ability to act as reinforcers; we discover that something acts as a reinforcer under given conditions; we do not concern ourselves over why it does so; for example, we do not ask whether it reduces drives or if it is pleasure producing; the law of effect is thus an empirical generalization; it is a useful rule that we have found to apply in a large number of cases, and its value and usefulness depend upon how far we may extend its application; why reinforcers act as they do is an unanswerable question
fixed-interval (FI) schedule
a reinforcer is once again available after a constant time has elapsed; schedule of reinforcement that requires that a set period of time pass, after which the first response produces reinforcement
differential reinforcement
response selection via different amounts or qualities of reinforcement; another way of describing the selecting of behaviors, depending upon reinforcement of some responses in the presence of some stimuli and the nonreinforcement of other responses in the presence of other stimuli; shaping, discrimination learning, and all of our learned behavior may be viewed as the result of this
negative punishment
response weakened by contingent removal or the avoidance of something; this occurs when a response decreases in frequency when its occurrence is followed by the offset of something; for example, a bit of misbehavior may be followed by the turning off of a television set; like negative reinforcement, the behavior removes some stimulus; like other cases of punishment, this produces a decrease in the frequency of the behavior which causes this consequence
homunculus
a generic term for the "little man" inside us; the "little man at the controls"
operant
a behavior subject to influence by its consequences; class of responses that vary together in strength as a function of the consequences produced by members of the class; pressing a lever acts as an operant, as does creative behavior, going to the store, and a myriad of other behaviors; we can identify an operant class only after we have observed that a given behavior is influenced by its consequences; it may take some time to identify most or all of the responses that make up an operant
inner man
a popular version of the homunculus; the "real me," my "self," the "I"
induction
Skinner's term for the "generalization" of the effects of reinforcement; actually means "generalization"; Skinner used this term, after Sherrington, to refer to induced changes in responding to one stimulus, as the result of reinforcement for responding to a similar stimulus
stimulus control
the effect of stimuli on behavior, especially operant behavior; the study of stimulus generalization and discrimination learning is now called the study of "stimulus control"
respondent conditioning
Skinner's name for Pavlovian or classical conditioning
variable-ratio (VR) schedule
reinforcers are given after different numbers of responses are made; schedule of reinforcement; this schedule provides reinforcement after the completion of some number of responses; the number varies from reinforcement to reinforcement, and the value of the schedule is the mean number of responses required; thus, a VR 25 schedule provides reinforcement after a varying number of responses, the average requirement being 25
schedule of reinforcement
the rule that relates the availability of a reinforcer to responding; rule by which reinforcers are delivered; the rule may include response requirements (as in ratio schedules), temporal requirements (as in interval schedules), or both; the first major analysis of reinforcement schedules and their effects was published by Ferster and Skinner in 1957
theory (for Skinner)
translations of empirical relationships to imaginary construct domains; for Skinner, a translation of terms; for example, if we attribute intelligence to properties of the brain, information processing mechanisms, a "smarts" center (to use Stephen Gould's term), or the like, we are proposing a theory; more generally, theories involve the use of intervening variables, rather than the independent and dependent variables we should be interested in; acceptable explanations refer only to the phenomena to be explained and the concrete conditions that influence them; the translation of these basic terms into hypothetical entities (habits, motives, and so on) should be avoided
shaping
the gradual development of complex behavior via differential reinforcement; commonly used term for the method of successive approximations; this method involves the selective reinforcement of some subset of a class of operant responses; this leads to extinction of the nonreinforced members and a consequent increase in the variations of behaviors emitted; the requirement for reinforcement may be progressively restricted until the final product is a set of behaviors very different from the original behavior class
response class
a set of behaviors that change in frequency together; this applies to reflex, conditioned reflex, and operant behavior; may be composed of members that do not intuitively seem to go together; for example, aggression is no doubt a number of response classes, some controlled by external stimuli, and others by consequences; each class may contain members discoverable only after long observation of behavior under a variety of conditions
positive reinforcer
an event that, when contingent on response, strengthens that response; a consequence of behavior that produces an increase in the frequency of that class of behavior
variable-interval (VI) schedule
reinforcers become available after different amounts of time have elapsed; schedule of reinforcement; this schedule provides reinforcement for the first response that occurs after some mean interval of time has passed since the last reinforcement; for example, a VI 6 minute schedule would reinforce the first response occurring after an average of 6 minutes; some interreinforcement intervals could be as short as a few seconds and others could be 10 minutes long or longer
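A sketch of how a VI timer might be programmed, assuming (purely for illustration) that each interreinforcement interval is drawn from an exponential distribution around the schedule's mean; the first response after the drawn interval elapses is the one reinforced.

import random

# Sketch of a VI 6-minute schedule; the exponential draw and the response
# stream (one response every 30 s) are assumptions made for illustration.
MEAN_INTERVAL = 6 * 60        # VI 6 min, in seconds

def next_setup_time(now):
    """Time at which reinforcement next becomes available (illustrative draw)."""
    return now + random.expovariate(1.0 / MEAN_INTERVAL)

random.seed(0)
setup = next_setup_time(0.0)
for t in range(0, 40 * 60, 30):      # pretend the subject responds every 30 s for 40 min
    if t >= setup:                   # first response after the setup is reinforced
        print(f"response at {t/60:5.1f} min reinforced")
        setup = next_setup_time(t)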
operant conditioning
the process whereby behavior changes because of its consequences; the process whereby an operant class is shown to become more frequent (that is, to increase in strength) as a function of the consequences it produces; thus, a rat may more frequently press a lever when presses are reinforced with food, and an infant may increase its emission of vocalizations when they are followed by praise and attention
token economy
desired behaviors earn credits that are exchangeable for goods and privileges; method of psychotherapy originated by Ayllon and Azrin in the early 1960s; patients, for whom other methods of therapy had failed, were reinforced for grooming, working, eating in an acceptable manner, and other activities
positive punishment
response weakened by contingent presentation of something
stimulus class
the set of stimuli that mutually control a given response class; the set of stimuli that may be shown to control a reflex, or operant class of responses; like the response class, this class may be composed of elements that may not seem intuitively obvious; what we call "concepts" are names for stimulus classes
Type S
one of Skinner's other names for the process of respondent conditioning; Skinner's term for classical conditioning, in which the emphasis is placed on the eliciting CS rather than upon the consequences of the elicited behavior
positive reinforcement
response strengthened by contingent presentation of something
theoretical law of effect
various hypothetical explanations of the empirical law of effect; attempts to explain the fact that reinforcers work as they do by postulating some underlying process; for example, Hull's suggestion that reinforcers work by reducing biological drives was this; one could suggest that all reinforcers promote survival, produce pleasure, or share some other characteristic; all such attempts have failed, and Skinner believed that we only waste time and effort by trying to work this out; his reasons are the same as those he uses to argue against theories of any kind
unfinished causal sequences
Skinner's term for explanations involving hypothetical inner states; Skinner's term for the common practice of explaining behavior and experience by reference to some hypothetical inner state or process; hence, we may explain unruly behavior as the product of aggressiveness and the ability to recite well as the result of a good memory; in both cases, we have done no more than name the behavior involved and, unless we explain aggressiveness and memory, we are left with these, not real explanations
Type R
one of Skinner's other names for the process of operant conditioning; Skinner's term for behavior that is sensitive to its consequences - that is, operant behavior; this type of conditioning is therefore operant conditioning
radical behaviorism
the philosophy that a science of behavior is feasible and practical; position described by Skinner in 1945 and 1963, which proposes a philosophy for a science of psychology, independent of specific theories of learning, whether Skinner's or anyone else's; according to this view, the entire subject matter of psychology may be treated as activity (behavior) and therefore mental activity is essentially the same in kind as physical activity; we may speak of thinking and seeing as behaviors, just as we do when we speak of walking and talking; argues against the usefulness and the existence of intervening variables and especially of internal copies of the world; some writers have pointed out the similarities between this point of view and that of modern European phenomenologists, such as Merleau-Ponty and Sartre; there is little doubt that Skinner named this view after the radical empiricism of William James
adjusting schedule
schedule of reinforcement in which the value of the interreinforcement interval or the response requirement changes as time passes since the last reinforcement; for example, this schedule could begin with a value of FR 10 and increase that value by 10 every 20 seconds; this should lead to high response rates, since, in effect, the schedule penalizes low response rates
alternative schedule
schedule of reinforcement in which reinforcement depends on either the passage of time or the fulfilling of a response requirement, whichever is satisfied first
chaining (chained schedule)
the joining together of a sequence of behaviors by a series of discriminative stimuli that also act as conditioned reinforcers; the occurrence of the first behavior produces the Sd for the second member of the chain, and so on; the appearance of the Sd acts to reinforce the first behavior and set the occasion for the second member, which leads to the appearance of the second Sd
component
basic unit of a multiple schedule, consisting of a specific Sd, schedule of reinforcement, and duration; components are presented successively; one example would be a two-minute period in which a green light was lighted and in which a VR 25 schedule was in effect
concept learning
discrimination learning in which the class of stimuli involved does not consist of specific concrete things like lights and tones; the concept may be "all four-legged creatures" or all true statements; we train concepts by presenting numerous instances of the concept and reinforcing our subject's responses to them, while presenting non-instances without reinforcement
conjunctive schedule
reinforcement schedule in which reinforcement is delivered only after the passage of time and the completion of a response requirement; for example, 47 responses may be followed by reinforcement if 2 minutes have passed since the last reinforcement
contingencies
another word for schedule; when Skinner speaks of these that are responsible for the development or maintenance of behavior, he refers to the requirements that govern the delivery of reinforcement; these may depend upon the passage of time, the occurrence of specific responses, the presence of specific stimuli, or a combination of these things
DRH schedule
schedule of reinforcement that reinforces high rates of responding; a differential reinforcement of high rates schedule requires that some number of responses occur within a fixed time period if reinforcement is to be received
DRL schedule
schedule of reinforcement that reinforces low rates of responding; a differential reinforcement of low rates schedule requires that no responses occur during some fixed period since the last response if reinforcement is to be received
dynamic laws
factors that influence the strength of a reflex, conditioned reflex, or operant response, which show their influence as the behavior repeatedly occurs; for example, the law of fatigue and the laws of conditioning and extinction are such laws
fixed-time schedule
schedule of reinforcement that provides reinforcement after fixed periods of time since the last reinforcement, independent of the subject's behavior; this schedule is indistinguishable from classical delayed conditioning; symbolized as FT
heterogeneous chain
chain of behaviors that are not of the same topography; for example, the rat Pliny performed a chain consisting of pulling a string, carrying a marble, and dropping it
homogeneous chain
schedule of reinforcement composed of a chain of behaviors in which the behaviors in each member of the chain are topographically similar; for example, a chain FR 2 VR 8 schedule requires that two lever presses occur in the presence of one stimulus, which leads to a change in the Sd, and pressing an average of 8 times in the presence of the new stimulus then leads to reinforcement; the response required in both members (for example, lever pressing) is the same, unlike in a heterogeneous chain
limited hold
a requirement that may be added to a VI or FI schedule, such that a response must occur within some set period of time (the LH value) after the schedule has made reinforcement available; for example, on a FI one-minute LH two-second schedule, reinforcement is available for two seconds after the passage of one minute; if a response does not occur, the reinforcement is lost
mixed schedule
schedule of reinforcement in which two or more different schedules are in effect for set periods of time, these periods appearing in sequence; for example, a mixed FI 2 VI 3 schedule might alternate two-minute periods, in which FI 2 and then VI 3 schedules were in force; if these periods are signaled by discriminative stimuli, we have a multiple schedule
multiple schedule
a mixed schedule in which different Sds signal the schedules that are in effect
punishment
the decrease in frequency of an operant behavior as a function of the consequences it produces; Skinner argues that this is not a basic effect and that effects attributed to it depend upon side effects of the aversive events used as punishers; we now know that he was mistaken and that punishers seem to work in a way opposite to the effects of reinforcers

for Guthrie, the evocation of incompatible behavior in the presence of old cues; for Guthrie, a change in behavior produced by stimuli that lead to a new behavior in the presence of old cues; what is learned is what is done, and punishment works when it leads to a new behavior incompatible with previous, recent behavior; for example, electric shock may be used to train a dog to jump through a hoop, providing that the shock is applied to the rear end of the dog

a decrease in the probability of a behavior owing to the consequences it produces; electric shock produced by a response is a commonly used punisher
respondent
Skinner's term for reflex and conditioned reflex behavior, which is elicited by an identifiable stimulus; respondent conditioning is his term for Pavlovian conditioning, or conditioning of Type S
static laws
in Skinner's early papers on the identification of reflexes, these referred to changes in responsiveness visible with single elicitations of the reflex; for example, we may see an increase in the magnitude of the response as we apply stronger eliciting stimuli; on a given occasion, a stronger stimulus produces a stronger response, irrespective of when the last stimulation was given
tandem schedule
two or more schedules of reinforcement arranged in sequence, so that the requirement of one schedule must be met before the next schedule begins; food or other reinforcement is delivered only after all schedules in the sequence have run; if different stimuli signal the successive schedules, it is a chained schedule
local contrast
a short-term high (low) response rate following a period of low (high) output; a sequential effect that occurs during discrimination learning; see Pavlovian induction
taste-aversion learning
the learned tendency to escape and avoid specific gustatory stimulation
opponent-process model
Solomon & Corbit's theory of reciprocal emotional interaction and change; theory of acquired motivation; according to this model, strong stimulation produces either positive or negative reactions (the A state) as well as compensating reaction in the opposing direction; with repeated stimulation, the opposing reaction (the B state) increases in strength and subtracts from the A reaction; the model provides a plausible account of the mechanisms involved in risk taking, drug addiction, and other phenomena
overshadowing
a conspicuous stimulus prevents learning about a less conspicuous one; effect first reported by Pavlov, who found that when more than one CS is presented at the same time, the more salient one may "overshadow" the less salient one, leaving the latter ineffective
response deprivation
a restriction on the normal availability of a behavioral opportunity; name given by Timberlake and Allison to their revision of the Premack Principle; according to this theory, reinforcement occurs when a behavior is restricted so that it can occur only at less-than-baseline level when a second behavior is required for access to the restricted behavior; the second behavior will increase in frequency (be reinforced) as long as the restricted behavior remains available at less than its baseline (unrestricted) level
Rescorla-Wagner model
a mathematical model of the associative learning process; theory of classical conditioning published by Rescorla and Wagner in 1972; according to this theory, conditioning always involves a compound CS and a limited amount of available associative strength, determined by the nature of the UCS; associative strength gained by one element of the compound CS detracts from that available to other elements; increments in associative strength on successive trials depend upon the maximum (asymptotic) level possible and the amount of associative strength already gained; the model accounts for overshadowing and blocking, for which it was formulated, as well as for other phenomena
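A compact sketch of the model's update rule, delta V = alpha * beta * (lambda - total V), showing how pretraining one CS leaves little associative strength for a CS added later (blocking); the learning rate and asymptote values are illustrative assumptions.

# Sketch of the Rescorla-Wagner update; RATE (alpha*beta) and LAMBDA are illustrative.
RATE, LAMBDA = 0.3, 1.0
V = {"A": 0.0, "B": 0.0}           # associative strengths of two conditioned stimuli

def trial(present):
    total = sum(V[cs] for cs in present)
    error = LAMBDA - total          # shared prediction error for the whole compound
    for cs in present:
        V[cs] += RATE * error

for _ in range(20):                 # phase 1: CS A alone is paired with the UCS
    trial(["A"])
for _ in range(20):                 # phase 2: compound AB; A already predicts the UCS
    trial(["A", "B"])
print(f"V(A) = {V['A']:.2f}, V(B) = {V['B']:.2f}   (B is blocked)")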
passive avoidance
another name for the effect of a positive punishment contingency; avoidance of noxious stimulation achieved by not responding; we passively avoid burns by not placing our hands on fires; the term was often used in place of the term punishment during several decades when it was believed that punishment was ineffective
transsituational
Meehl: the view that if something reinforces in one circumstance, it will do so in all; term used by Meehl in 1950 to refer to the power of a reinforcer of one behavior (such as bar pressing) to act the same in different situations and with different behaviors (such as key pecking)
Premack Principle
higher probability behaviors will reinforce lower ones, but not vice versa; revision to the law of effect first proposed by David Premack in 1959; according to this view, activities in which we engage may be ranked along a continuum of value; reinforcement occurs when lower-valued activities provide access to higher-valued activities; when a higher-valued activity produces access to lower-valued activities, the former decrease in frequency (are punished)
response language
the name Premack gave to his way of viewing the law of effect; instead of speaking of behaviors producing reinforcing stimuli, such as food, Premack spoke of behaviors producing access to other behaviors, such as eating
value
the basic intervening variable in Premack's theory; differentiates behaviors; refers to the attractiveness of activities and may be assessed in terms of time spent in one or another activity
phobia
an irrational and extreme fear of a normally neutral object or situation; an unnatural and usually unreasonable fear of common stimuli; agoraphobia, for example, refers to the fear of open spaces; siderophobia is a fear of railroad trains
ICS
intracranial stimulation; reinforcing electrical stimulation of the brain
John Garcia
discovered conditioned taste aversion phenomena
A state
Solomon & Corbit: the first reaction to an unconditioned stimulus; initial reaction to an affect-producing UCS, such as electric shock; with repetition, the A state diminishes in strength; this occurs because the opponent B state increases in strength over trials
Robert Rescorla
a prominent modern learning theorist, math modeler, and analytic genius
habituation
a decrease in unconditioned response strength with repeated elicitation; for example, the startle response to a gun shot diminishes with repetition
conservation
total output remains constant while the distribution of output changes; principle applied to the allocation of activities under schedule constraints; Allison and his colleagues have shown that total behavior per session remains constant under many conditions; thus, the more of one activity means less of some others
B state
Solomon & Corbit: the second and opposite reaction to an unconditioned stimulus; term for the opponent reaction produced when the A state is produced; if the A reaction appears as an increase in heart rate when shock is applied, this reaction is a decrease in heart rate; is assumed to increase in strength with repetition and to thus eventually reduce the A state to the diminished A' state
behavioral contrast
opposite rate change in one stimulus after schedule change in another; an increase or decrease in response rate to a stimulus when a change in conditions occurs during a second stimulus; when reinforcement rate is decreased for responses to a red light, response rate may increase greatly during green, even though there is no increase in reinforcement frequency in green
blocking
earlier conditioning to one element prevents later conditioning to another; term used by Kamin to refer to the effect of pretraining with one CS on subsequent failure to establish a CR to a second stimulus presented along with the original CS; in terms of the Rescorla-Wagner model, the first CS gains the majority of the associative strength available, preventing additional conditioning to the added CS
autoshaping
the development of operant-like behaviors via respondent procedures
Richard Solomon
the principal theorist of the opponent-process model
contingency
an overall correlation between either two stimuli or a behavior and an event; contingent events have an if/then relationship; as an example, a CS which unequivocally predicts a UCS has a clear contingent relationship with the UCS; a CS which is always followed by a UCS does not have such a contingent relationship if the UCS also appears at other times
Nathan Azrin
conducted extensive and definitive studies of punishment in the 1960s
James Olds
discovered reinforcing electrical stimulation of the brain
Richard Herrnstein
Harvard psychology professor who first described the matching law
David Premack
U of M grad who originated the relativistic approach to reinforcement
commensurability of units
the major problem with Premack's Principle, as originally proposed, in which widely different activities are scaled on a common continuum of value; it becomes difficult to assume that one unit of an activity is commensurate, or translatable, into one unit of another activity; for example, is one minute spent eating commensurate with one minute of reading?
primary reinforcer
a reinforcer that acts as such in the absence of prior experience with it; its power does not depend upon learning; food and water are usually viewed as these
classical conditioning
learning to make an involuntary (reflex) response to a stimulus other than the original, natural stimulus that normally produces the reflex
unconditioned stimulus (UCS)
a naturally occurring stimulus that leads to an involuntary (reflex) response
unconditioned response (UCR)
an involuntary (reflex) response to a naturally occurring or unconditional stimulus
neutral stimulus (NS)
stimulus that has no effect on the desired response
conditioned stimulus (CS)
stimulus that becomes able to produce a learned reflex response by being paired with the original unconditioned stimulus
conditioned response (CR)
learned reflex response to a conditioned stimulus
stimulus generalization
the tendency to respond to a stimulus that is only similar to the original conditioned stimulus with the conditioned response
stimulus discrimination
the tendency to stop making a generalized response to a stimulus that is similar to the original conditioned stimulus because the similar stimulus is never paired with the unconditioned stimulus
extinction
the disappearance or weakening of a learned response following the removal or absence of the unconditioned stimulus (in classical conditioning) or the removal of a reinforcer (in operant conditioning)
spontaneous recovery
the reappearance of a learned response after extinction has occurred
higher-order conditioning
occurs when a strong conditioned stimulus is paired with a neutral stimulus, causing the neutral stimulus to become a second conditioned stimulus
conditioned emotional response (CER)
emotional response that has become classically conditioned to occur to learned stimuli, such as a fear of dogs or the emotional reaction that occurs when seeing an attractive person
vicarious conditioning
classical conditioning of a reflex response or emotion by watching the reaction of another person
conditioned taste aversion
development of a nausea or aversive response to a particular taste because that taste was followed by a nausea reaction, occurring after only one association
biological preparedness
referring to the tendency of animals to learn certain associations, such as taste and nausea, with only one or a few pairings, due to the survival value of the learning
stimulus substitution
original theory in which Pavlov stated that classical conditioning occurred because the conditioned stimulus became a substitute for the unconditioned stimulus by being paired closely together
cognitive perspective
modern theory in which classical conditioning is seen to occur because the conditioned stimulus provides information or an expectancy about the coming of the unconditioned stimulus
operant conditioning
the learning of voluntary behavior through the effects of pleasant and unpleasant consequences to responses
law of effect
law stating that if an action is followed by a pleasurable consequence, it will tend to be repeated, and if followed by an unpleasant consequence, it will tend to not be repeated
operant
any behavior that is voluntary
reinforcement
any event or stimulus that, when following a response, increases the probability that the response will occur again
reinforcers
any events or objects that, when following a response, increase the likelihood of that response occurring again.
primary reinforcer
any reinforcer that is naturally reinforcing by meeting a basic biological need, such as hunger, thirst, or touch
secondary reinforcer
any reinforcer that becomes reinforcing after being paired with a primary reinforcer, such as praise, tokens, or gold stars
positive reinforcement
the reinforcement of a response by the addition or experience of a pleasurable stimulus
negative reinforcement
the reinforcement of a response by the removal, escape from, or avoidance of an unpleasant stimulus
partial reinforcement effect
the tendency for a response that is reinforced after some, but not all, correct responses to be very resistant to extinction
continuous reinforcement
the reinforcement of each and every correct response
fixed interval schedule of reinforcement
schedule of reinforcement in which the interval of time that must pass before reinforcement becomes possible is always the same
variable interval schedule of reinforcement
schedule of reinforcement in which the interval of time that must pass before reinforcement becomes possible is different for each trial or event
fixed ratio schedule of reinforcement
schedule of reinforcement in which the number of responses required for reinforcement is always the same
variable ratio schedule of reinforcement
schedule of reinforcement in which the number of responses required for reinforcement is different for each trial or event
punishment
any event or object that, when following a response, makes that response less likely to happen again
punishment by application
the punishment of a response by the addition or experience of an unpleasant stimulus
punishment by removal
the punishment of a response by the removal of a pleasurable stimulus
discriminative stimulus
any stimulus, such as a stop sign or a doorknob, that provides the organism with a cue for making a certain response in order to obtain reinforcement
shaping
the reinforcement of simple steps in behavior that lead to a desired, more complex behavior
successive approximations
small steps in behavior, one after the other, that lead to a particular goal behavior
instinctive drift
tendency for an animal's behavior to revert to genetically controlled patterns
behavior modification
the use of operant conditioning techniques to bring about desired changes in behavior
token economy
type of behavior modification in which desired behavior is rewarded with tokens
applied behavior analysis (ABA)
modern term for a form of functional analysis and behavior modification that uses a variety of behavioral techniques to mold a desired behavior or response
biofeedback
use of feedback about biological conditions to bring involuntary responses, such as blood pressure and relaxation, under voluntary control
neurofeedback
form of biofeedback using brain-scanning devices to provide feedback about brain activity in an effort to modify behavior
Classical Conditioning
learning to make an involuntary (reflex) response to a stimulus other than the original, natural stimulus that normally produces the reflex
Unconditioned Stimulus (UCS)
a naturally occurring stimulus that leads to an involuntary (reflex) response; ex: the dog food
unconditioned response (UCR)
an involuntary (reflex) response to a naturally occurring or unconditioned stimulus. unlearned and occurs because of genetic wiring.
Neutral stimulus (NS)
stimulus that initially has no effect on the reflex response but that can become able to produce a learned reflex response by being paired with the original unconditioned stimulus; ex: the dogs began to salivate on seeing the dish, and the dish then became the conditioned stimulus
conditioned response
learned reflex response to a conditioned stimulus
Stimulus
any object, event, or experience that causes a response, the reaction of an organism
unconditioned response
reflex response that is naturally produced
Unconditioned
Any of the responses or stimuli that is natural or genetic.
Acquisition
the repeated pairing of the NS and the UCS is usually called ____________, because the organism is in the process of learning.
The CS must come before the UCS.
If Pavlov rang the bell just after he gave the dogs food, they did not become conditioned.
The CS and UCS must come together very close in time, ideally no more than 5 seconds apart.
When Pavlov stretched the time between the potential CS and the UCS to several minutes, no association or link between the two was made.
Interstimulus Interval (ISI)
time between the CS and UCS
the neutral stimulus must be paired with the UCS several times, often many times, before conditioning can take place
...
the CS is usually some stimulus that is distinctive or stands out from other competing stimuli
The bell was a sound that was not normally present in the lab and therefore distinct
Conditioned Stimulus
stimulus that becomes able to produce a learned reflex response by being paired with the original unconditioned stimulus. the bell
Conditioned Response
learned reflex response to a conditioned stimulus.
Stimulus Discrimination
the tendency to stop making a generalized response to a stimulus that is similar to the original conditioned stimulus because the similar stimulus is never paired with the unconditioned stimulus. EX. although the sound of the coffee grinder might produce a little anxiety in the dental-drill hating person, after a few uses that sound will no longer produce anxiety because it isn't associated with dental pain.
Extinction
The disappearance or weakening of a learned response following the removal or absence of the unconditioned stimulus (in classical conditioning) or the removal of a reinforcer (in operant conditioning)
Reinforcer
any event or object that, when following a response, increases the likelihood of that response occurring again
spontaneous recovery
the reappearance of a learned response after extinction has occurred. the conditioned response can briefly reappear when the original CS returns, although the response is usually weak and short lived.
Higher Order Conditioning
Another concept of classical conditioning. Occurs when a strong conditioned stimulus is paired with a neutral stimulus, causing the neutral stimulus to become a second conditioned stimulus. Pavlov would snap his fingers (neutral stimulus), then ring the bell (conditioned stimulus) and give food to produce salivation (conditioned response); the snap would then trigger salivation if paired enough times, turning the snap into another conditioned stimulus. Without the UCS, the higher order conditioning would be difficult to maintain and would gradually fade away.
unconditioned
"unlearned" or "naturally occurring"
Conditioned emotional response
emotional response that has become classically conditioned to occur to learned stimuli, such as a fear of dogs or the emotional reaction that occurs when seeing an attractive person
Vicarious Conditioning
Classical conditioning of a reflex response or emotion by watching the reaction of another person; ex: kids in line to get a shot; eventually one kid starts to cry, and by the end of the line all of the kids will be crying
conditioned taste aversion
development of a nausea or aversive response to a particular taste because that taste was followed by a nausea reaction, occurring after only one association
biological preparedness
referring to the tendency of animals to learn certain associations, such as taste and nausea, with only one or a few pairings, due to the survival value of the learning
stimulus substitution
original theory in which Pavlov stated that classical conditioning occurred because the conditioned stimulus became a substitute for the unconditioned stimulus by being paired closely together
cognitive perspective
modern theory in which classical conditioning is seen to occur because the conditioned stimulus provides information or an expectancy about the coming of the unconditioned stimulus; ex: Robert Rescorla's experiments in which a tone reliably predicted shock for one group of rats but not for another; only the rats for which the tone predicted the shock showed strong fear when the tone sounded
Operant Conditioning
the learning of voluntary behavior through the effects of pleasant and unpleasant consequences to responses
Law of Effect
law stating that if an action is followed by a pleasurable consequence, it will tend to be repeated, and if followed by an unpleasant consequence, it will tend not to be repeated; developed by Thorndike
operant
any behavior that is voluntary
reinforcement
any event or stimulus, that when following a response , increases the probability that the response will occur again
primary reinforcer
any reinforcer that is naturally reinforcing by meeting a basic biological need, such as hunger, thirst, or touch; examples: a candy bar (hunger drive), touch (pleasure drive), or liquid (thirst drive)
secondary reinforcer
any reinforcer that becomes reinforcing after being paired with a primary reinforcer, such as praise, tokens, gold stars, or money
positive reinforcement
the reinforcement of a response by the addition or experiencing of a pleasurable stimulus
negative reinforcement
the reinforcement of a response by the removal , escape from or avoidance of an unpleasant stimulus
punishment
any event or object that when following a response makes that response less likely to happen again
punishment by application
the punishment of a response by the addition or experiencing of an unpleasant stimulus. Something unpleasant such as spanking
punishment by removal
the punishment of a response by the removal of a pleasurable stimulus after the behavior occurs; ex: grounding or no TV; this form is generally considered more acceptable by child-development experts
successive approximations
small steps in behavior one after the other, that lead to a particular goal behavior
discriminative stimulus
any stimulus, such as a stop sign or a doorknob, that provides the organism with a cue for making a certain response in order to obtain reinforcement; ex: a baby calls all males "dada"; when other males do not respond positively and only the real dada reinforces the behavior, the father becomes the discriminative stimulus for saying "dada"
Thorndike's Puzzle Box
the lever is the stimulus, the pushing of the lever is the response, and the consequence is both escape and food; from this Thorndike developed the law of effect
BF Skinner
Came up with the term operant conditioning . Behaviorist
antecedent stimulus
a stimulus that comes before the response; antecedent means something that comes before another thing
Operant Conditioning Learning
depends on what happens after the response( in classical conditioning it's what happens before the response)
Classical Conditioning
Classical Conditioning or Operant Conditioning. End result is the creation of a new response to a stimulus that did not normally produce that response
Operant Conditioning
Classical Conditioning or Operant Conditioning. An expectancy develops for reinforcement to follow a correct response
Classical Conditioning
Classical Conditioning or Operant Conditioning. An expectancy develops for UCS to follow CS
Operant Conditioning
Classical Conditioning or Operant Conditioning. Reinforcement should be immediate
Operant Conditioning
Classical Conditioning or Operant Conditioning. Responses are voluntary
Operant Conditioning
Classical Conditioning or Operant Conditioning. End results is an increase in the rate of an already occurring response.
Operant Conditioning
Classical Conditioning or Operant Conditioning. Consequeces are important in forming an association
Classical Conditioning
Classical Conditioning or Operant Conditioning. Responses are involuntary and reflexive.
Classical Conditioning
Antecedent stimuli are important in forming an association
Punishment should immediately follow the behavior it is meant to punish
if the punishment comes long after the behavior, it will not be associated with that behavior
Punishment should be consistent
First, if a parent promises a certain punishment for a behavior, the parent must follow through. Second, the punishment should stay at the same intensity or increase slightly; it should never decrease.
Punishment of the wrong behavior should be paired, whenever possible, with reinforcement of the right behavior
if a kid eats food with his or her fingers, the parent should take the kid's hand, say "no, we eat with a fork," and then give the kid a fork
shaping
the reinforcement of simple steps in behavior that lead to a desired, more complex behavior, like the tricks taught to circus elephants
partial reinforcement effect
the tendency for a response that is reinforced after some, but not all , correct responses to be very resistant to extinction
continuous reinforcement
the reinforcement of each and every correct response
fixed interval schedule of reinforcement
schedule of reinforcement in which the interval of time that must pass before reinforcement becomes possible is always the same; example: a paycheck
Variable interval schedule of reinforcement
schedule of reinforcement in which the interval of time that must pass before reinforcement becomes possible is different for each trial or event (example: pop quizzes)
Fixed Ratio schedule of reinforcement
schedule of reinforcement in which the number of responses required for reinforcement is always the same
variable ratio schedule of reinforcement
schedule of reinforcement in which the number of responses required for a reinforcement is different for each trial or event (example: slot machines)
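The four schedules above differ only in how the reinforcement criterion is set: fixed vs. variable, and interval (time-based) vs. ratio (count-based). As a rough illustration only, here is a minimal Python sketch of those decision rules; the function names and parameter values (ratio=5, interval=60, etc.) are invented for this example and do not come from the textbook or any library.

```python
import random

def fixed_ratio(n_responses, ratio=5):
    # Reinforce every `ratio`-th response (count-based, always the same).
    return n_responses % ratio == 0

def variable_ratio(mean_ratio=5):
    # Reinforce each response with probability 1/mean_ratio, so the
    # required count varies around the mean (like a slot machine).
    return random.random() < 1.0 / mean_ratio

def fixed_interval(seconds_since_reinforcer, interval=60):
    # The first response after a fixed amount of time is reinforced
    # (like a paycheck arriving on a set schedule).
    return seconds_since_reinforcer >= interval

def variable_interval(seconds_since_reinforcer, next_interval):
    # Same idea, but `next_interval` is re-drawn after each reinforcer
    # (like a pop quiz that could come at any time).
    return seconds_since_reinforcer >= next_interval

# Example: a burst of 20 responses on a fixed-ratio 5 schedule.
reinforced = [fixed_ratio(n) for n in range(1, 21)]
print(reinforced.count(True))  # -> 4 reinforcers earned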
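```

The sketch only decides whether a single response would be reinforced; resistance to extinction (the partial reinforcement effect defined above) comes from how unpredictable that decision is under the variable schedules.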
behavior modification
the use of operant conditioning techniques to bring about desired changes in behavior (and sometimes classical conditioning)
token economy
type of behavior modification in which desired behavior is rewarded with tokens
applied behavior analysis (ABA)
modern term for a form of behavior modification that uses shaping techniques to mold a desired behavior or response. Began with work by Lovaas
biofeedback
using feedback about biological conditions to bring involuntary responses, such as blood pressure and relaxation, under voluntary control
neurofeedback
form of biofeedback using brain-scanning devices to provide feedback about brain activity in an effort to modify behavior
Interval schedule
used when the timing of the response is more important than the number of responses made
Ratio Schedule
used when the number of responses is what matters, because a certain number of responses is required for each reinforcer
Insight
the sudden perception of relationships among various parts of a problem, allowing the solution to come quickly
Learned helplessness
the tendency to fail to act to escape from a situation because of a history of repeated failures in the past (e.g., dogs exposed to repeated inescapable shocks stopped trying to escape shock)
observational learning
learning new behavior by watching a model perform that behavior (e.g., children who watched an adult beat up a doll later imitated the behavior)
learning / performance distinction
refers to the observation that learning can take place without actual performance of the learned behavior (the children who watched the doll being beaten had learned the behavior even before performing it)
latent learning
learning that remains hidden until its application becomes useful. The idea that learning could happen without reinforcement and then later affect behavior was not something traditional operant conditioning could explain. Example: in Tolman's rat maze study, the second group began exiting the maze much faster once reinforcement was introduced
Wolfgang kohler
Gestalt psychologist who became marooned on an island in the Canaries when World War I broke out, stuck at the primate research lab that had first drawn him to the island
Martin seligman
famous for founding the field of positive psychology, a new way of looking at the entire concept of mental health and therapy
Bandura
concluded from his studies and others that observational learning requires the presence of four elements; conducted the Bobo doll (clown doll beat-up) study
Attention
to learn anything through observation, the learner must first pay attention to the model (e.g., to know which utensil to use, you have to watch the person who seems to know what is correct). People pay more attention to models who are similar to them or attractive
memory
the learner must also be able to retain the memory of what was done, such as remembering the steps in preparing a dish that was first seen on a cooking show
imitation
the learner must be capable of reproducing or imitating the actions of the model
motivation
the learner must have the desire or motivation to perform the action; if a person is given a reward, they are much more likely to imitate
Four Elements
Attention, Memory, Imitation, Motivation
Learning
any relatively permanent change in behavior brought about by experience or practice
Reflex
an involuntary response, one that is not under personal control or choice
instinctive drift
tendency for an animal's behavior to revert to genetically controlled patterns
Edwin R. Guthrie
Originated the all-or-none S-R association-through-contiguity theory.
INCOMPATIBLE STIMULI
One of Guthrie's methods for eliminating habits; "counter-conditioning"
MAINTAINING STIMULI
Guthrie's concept for motivation or drive.
MOTIVE
Guthrie: a set of maintaining stimuli that instigate behavioral action.
PUNISHMENT
For Guthrie, the evocation of incompatible behavior in the presence of old cues.
Stereotypy
Precise repetition of the exact movement in a behavior.
TOLERATION
Gradually fading in bad-habit cues while the learner is behaving well.
STIMULUS SAMPLING THEORY
Estes' mathematical model of learning based on Guthrie's theory.
W.K. Estes
UofM alumnus who mathematicized Guthrie's theory
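Estes' stimulus sampling theory turns Guthrie's one-trial contiguity idea into arithmetic: on each trial a random fraction of the stimulus elements is sampled and becomes conditioned, all or none, to the response, so the probability of the response grows trial by trial as p(n+1) = p(n) + theta * (1 - p(n)) in the model's simplest form. A minimal sketch of that recursion follows; the variable names and the value theta=0.2 are illustrative choices, not taken from Estes' papers.

```python
# Stimulus sampling theory (simplest form): each trial conditions a
# sampled fraction (theta) of the remaining unconditioned elements,
# so response probability climbs toward 1.0.
def response_probability(trials, theta=0.2, p0=0.0):
    p = p0
    curve = [p]
    for _ in range(trials):
        p = p + theta * (1.0 - p)  # p(n+1) = p(n) + theta * (1 - p(n))
        curve.append(p)
    return curve

curve = response_probability(trials=10)
print([round(p, 2) for p in curve])
# -> a smooth, negatively accelerated learning curve: 0.0, 0.2, 0.36, 0.49, ...
```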
CONTIGUITY
Nearness in time or space; in learning, events going together in time.
SIDETRACKING
Guthrie: changing a habit by altering its terminal components
OVERCORRECTION
Changing a habit by repeatedly practicing the correct action in situ
NEGATIVE PRACTICE
Deliberately practicing a bad habit to discover what cues produce it.
REDINTEGRATION
Calling up an associative complex by one or more constituent elements.
EXHAUSTION
One of Guthrie's methods for eliminating habits; like "flooding"
Contiguity Theory
Closeness in time causes association in learning.
All-or-None Learning
the view that an association is formed suddenly and completely, or not at all
Continuity Theory
the view that the strength of learning changes gradually over time (or space), blending from one value into another
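The contrast between all-or-none and continuity views also matters for how data are plotted: if each individual learns in a single jump on some random trial, averaging many individuals still produces a smooth, gradual group curve, which is why Voeks, Estes, and Guthrie argued for single-trial, single-subject data (see the later question on this point). Below is a minimal simulation of that averaging artifact; the parameters (p_jump=0.25, 500 subjects, 10 trials) are made up for illustration.

```python
import random

def all_or_none_learner(n_trials, p_jump=0.25):
    # Each subject performs at 0 until, on some random trial, the
    # association forms completely; from then on performance is 1.
    learned = False
    curve = []
    for _ in range(n_trials):
        if not learned and random.random() < p_jump:
            learned = True
        curve.append(1.0 if learned else 0.0)
    return curve

random.seed(1)
subjects = [all_or_none_learner(10) for _ in range(500)]
group_mean = [sum(trial) / len(subjects) for trial in zip(*subjects)]
print([round(m, 2) for m in group_mean])
# Individual curves are step functions, but the group mean rises
# gradually -- it looks like "continuity" even though no single
# subject learned gradually.
```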
According to Guthrie, perception, imagery, memory, fantasy, and dreaming all depend upon present cues and upon the Answer that they bring about.
...
According to Guthrie's Answer principle, the last response made in a situation the last time that situation was encountered will be the first or most likely response made when that situation recurs.
...
After sitting on tacks once, we hesitate before sitting again. This is because the effect of the first sitting was to encourage standing up.
...
Select one:
...
True
...
False
...
Also known as the noncontinuity view, Guthrie's Answer learning principle holds that an association is either fully formed on the occasion of its initial learning, or it is not formed at all.
...
Behavior therapy is usually associated with Skinner, but may in fact owe more to Guthrie.
...
Select one:
...
True
...
False
...
Changing a bad habit by repeatedly practicing the correct behavior in the situation in which we wish it to occur: Answer.
...
A child who has a temper tantrum and throws his dishes and food is required to clean up the floor, wash the dishes, and put them away in the cupboard. This would be an application of the Answer method of changing a bad habit.
...
Conquering a bad habit by repeatedly performing it in the presence of the cues that normally produce it is called Answer.
...
Drive stimuli was E. R. Guthrie's term for stimuli that give rise to activity that persists until they (the stimuli in question) are finally removed.
...
Select one:
...
True
...
False
...
The effect of rewards and punishers, said Guthrie, is dependent on what they make an individual do.
...
Select one:
...
True
...
False
...
The effect of rewards and punishers, said Guthrie, is ____________________.
...
Select one:
...
a. dependent on the pleasure or pain they produce.
...
b. dependent on their frequency and predictability.
...
c. dependent on what they, as stimuli, make an individual do.
...
d. dependent on what they make the average individual do.
...
Guthrie and Horton found that movements that did not aid in escape from the puzzle box decreased with training.
...
Select one:
...
True
...
False
...
For Guthrie, a Answer may be viewed as a set of maintaining stimuli that produce action until finally removed.
...
For Guthrie, anticipation, intention, foresight, and expectation are all dependent on memory traces.
...
Select one:
...
True
...
False
...
For Guthrie, anticipation, intention, foresight, and expectation are all dependent on ___________________.
...
Select one:
...
a. memory traces.
...
b. conduction units.
...
c. foresight.
...
d. precurrent responses. (correct answer)
...
Guthrie believed that forgetting was essentially the same phenomenon as extinction.
...
Select one:
...
True
...
False
...
Guthrie believed that when Answer works to eliminate a behavior, it is not because the consequence weakens old associations that control the original response, as Thorndike had once taught, but because it causes the learner to react in ways that are physically incompatible with the previous behavior.
...
Guthrie believed that rewards and punishers are themselves in need of explanation.
...
Select one:
...
True
...
False
...
Guthrie's interpretation of "intention" is exactly what Thorndike meant by associative shifting.
...
Select one:
...
True
...
False
...
Guthrie's interpretation of motivation, compared with Watson's, is essentially the same.
...
Select one:
...
True
...
False
...
Guthrie's interpretation of motivation, compared with Watson's, is completely different.
...
Select one:
...
True
...
False
...
Guthrie's advice for training a dog to come on command illustrates his belief that the dog must be made to come on command.
...
Select one:
...
True
...
False
...
Guthrie's advice for training a dog to come on command illustrates his belief that _________________.
...
Select one:
...
a. an appropriate reward must be offered.
...
b. the dog must be properly motivated.
...
c. inappropriate punishment should not be used.
...
d. the dog must move toward you in contiguity with the word "come". (correct answer)
...
Guthrie's method of Answer is one of his methods for eliminating habits. In this case, the behavior to be eliminated is repeatedly evoked in the presence of its usual stimuli until the learner is too fatigued to make the response. Then whatever else the learner does is contiguous with those stimuli and thereby replaces the undesired behavior.
...
One of Guthrie's methods for changing habits, in Answer one allows the movements that constitute the habit to begin, but then causes movements incompatible with the habit to follow. (Answer: punishment)
...
Guthrie's method of exhaustion is currently called Answer.
...
Guthrie's murderer, who had actually "changed his mind," illustrates Guthrie's contention that precurrent activities may be both verbal and motor.
...
Select one:
...
True
...
False
...
Guthrie suggested (after Knight Dunlap) that we may often improve our skills by practicing mistakes, an odd suggestion given his theory. This is called Answer.
...
If a child throws a coat on the floor, rather than hanging it up, and we force her to hang it up and take it down several times, we are using the __________________.
...
Select one:
...
a. punishment procedure.
...
b. negative practice procedure.
...
c. fatigue procedure.
...
d. overcorrection procedure.
...
If a child throws a coat on the floor, rather than hanging it up and we force her to hang it up and take it down several times, we are using the overcorrection procedure.
...
Select one:
...
True
...
False
...
If we wish to have pleasant dreams after a nightmare has awakened us, we are best advised to __________________.
...
Select one:
...
a. go back to sleep quickly.
...
b. drink a glass of milk.
...
c. change our sleeping position as much as possible.
...
d. focus on pleasant thoughts.
...
In alleyway training, the last thing done by the rat on a trial is to stop at the goal box and eat. Yet, the rat does not stand still and make chewing movements when placed in the start box and alley the next day, as one might expect from Guthrie's theory, because _________________.
...
Select one:
...
a. the stimuli there are different from those in the goal box.
...
b. Guthrie was mistaken in his interpretation of reinforcement.
...
c. memory limitations prevent it.
...
d. rats typically run before eating.
...
In Guthrie's theory, Answer stimuli are stimuli which arise from the behavioral movements of the body and serve to bridge the time gap between the occurrence of an external stimulus and the completion of an appropriate and integrated behavioral act.
...
In Guthrie's theory, motivation or drive is represented by stimuli that are associated with behaviors appropriate to satisfying the specific drive or motive. Guthrie hypothesized that we are led to act appropriately by these stimuli, and that we continue to act in this way until these stimuli are removed. He called these stimuli Answer stimuli.
...
In the terminology of Guthrie's theory, Answer refers to the repetition of a movement or a series of movements in precisely the form that occurred when the individual was last in the same situation.
...
The least promising of Guthrie's methods for changing habits was sidetracking.
...
Select one:
...
True
...
False
...
"L F M N X" shows, as Guthrie believed, that redintegration may be quite specific.
...
Select one:
...
True
...
False
...
A mental patient angrily tears up a bed. As a consequence, he is required to remake that bed and to make twenty other beds as well. This would be an application of the Answer method of changing a bad habit.
...
The method of Answer was Guthrie's method of changing behavior by introducing stimuli that produce a reaction incompatible with the undesired behavior in the presence of the other stimuli that occasion that behavior.
...
On Guthrie's theory, if the stimulus change that follows an action produces strong new reactions that are incompatible with the last actions in the presence of the old cues, the effect is Answer. (Answer: incompatible stimuli)
...
On Guthrie's theory, Davidson's contrafreeloading rats pressed a lever because ___________________.
...
Select one:
...
a. free food was present.
...
b. they had pressed the lever before.
...
c. they were hungry.
...
d. free food was present in the past, when they were hungry.
...
One of Guthrie's methods for changing habits, in the Answer method, cues that ordinarily produce unwanted reactions are presented in graded steps while another activity is occurring.
...
A person may repeatedly type "hte" to avoid misspelling "the." This exemplifies what Knight Dunlap called overcorrection.
...
Select one:
...
True
...
False
...
The principle of Answer Answer is really the essence of Guthrie's theory, and he proposed that that word was a better term to describe his theory than was conditioning.
...
The problem with the schoolgirl who failed to hang up her coat lay in the fact that the mother had always been present when it was hung.
...
Select one:
...
True
...
False
...
The problem with the schoolgirl who failed to hang up her coat lay in the fact that the mother __________________.
...
Select one:
...
a. had always been present when it was hung. (correct answer)
...
b. had severely punished failure to hang it.
...
c. had actually rewarded failure to hang it.
...
d. was guilty of a failure to communicate.
...
Reading for pleasure and reading because of an exam are different because ___________________.
...
Select one:
...
a. what we are reading is different.
...
b. we have chosen to read in the first case.
...
c. our actual reading activity differs.
...
d. of pleasures and pains produced.
...
Repeatedly practicing a bad tennis backhand until we notice the cues that produce it so that we can practice the correct movements in the presence of those cues would be an example of Answer.
...
This school of British philosophy, represented by Locke, Berkeley, Hume, and later by James Mill and his son John Stuart Mill, stressed the importance of the association of ideas that occur in close spatial and temporal contiguity: Answer
...
Seward's test of Guthrie's theory of reinforcement actually shows effects of punishment.
...
Select one:
...
True
...
False
...
Seward's test of Guthrie's theory of reinforcement indeed shows that Guthrie was wrong.
...
Select one:
...
True
...
False
...
Species-specific defense reactions illustrate what Guthrie meant by punishment.
...
Select one:
...
True
...
A stimulus change that leaves some behavior as the last thing done in a situation is called a punishment.
...
Select one:
...
True
...
False
...
To say that we do in a situation what we did the last time that we were in that situation is to say that our activity shows learning.
...
Select one:
...
True
...
False
...
To say that we do in a situation what we did the last time that we were in that situation is to say that our activity shows _________________.
...
Select one:
...
a. docility.
...
b. purpose.
...
c. learning.
...
d. stereotypy.
...
The term Answer Answer is used in the history of the psychology of learning to refer to one of the two classic views on the time course of learning, where it refers to the belief that the strength of learning changes gradually over time, with the strength at one point in time blending into a slightly greater strength at a later point in time.
...
Voeks and Estes showed that Guthrie may be correct and that learning does occur in one trial.
...
Select one:
...
True
...
False
...
The utterance "LFMNX" would most likely mean something to a waitress in a restaurant that specializes in ham-and-egg breakfasts, according to Guthrie, because of the process he called _____________________________.
...
Select one:
...
a. intention.
...
b. toleration.
...
c. associative interference.
...
d. redintegration.
...
Voeks, Estes, and Guthrie emphasized that a true picture of learning may be obtained only if data ___________________.
...
Select one:
...
a. are averaged over many subjects.
...
b. are averaged over many days but only for individual subjects.
...
c. represent single trials averaged over many subjects.
...
d. represent single trials for individual subjects.
...
Watson called it counterconditioning and Wolpe used it in systematic desensitization. Guthrie's name for the method is _________________.
...
Select one:
...
a. exhaustion.
...
b. incompatible stimuli.
...
c. toleration.
...
d. sidetracking.
...
Watson usually claimed that all thinking and perceiving could be reduced to muscular movement. Guthrie's opinion on this was ___________________.
...
Select one:
...
a. completely in disagreement with Watson.
...
b. identical to Watson's.
...
c. similar, though he was less certain.
...
d. irrelevant to Watson's.
...
When a rat learns a maze for food "reward," reward occurs many times throughout the maze.
...
Select one:
...
True
...
False
...
When Guthrie spoke of "situations" he meant ___________________.
...
Select one:
...
a. external environments that we find ourselves in.
...
b. external environments that we have recently been in.
...
c. stimuli both outside us and inside our body.
...
d. aspects of our environments that we specifically identify.
...
When we study even though we are very tired, we may learn to read without following.
...
Select one:
...
True
...
False
...
When we study even though we are very tired, we may __________________.
...
Select one:
...
a. learn to read without following. (correct answer)
...
b. think that we are learning, but we are not.
...
c. actually learn as well as when refreshed.
...
d. illustrate the principle of negative practice.
...
Which of the following would Guthrie not view as a motive or drive?
...
Select one:
...
a. hunger
...
b. boredom
...
c. pleasure
...
d. pain
...
The word Answer simply means "nearness," and can refer to nearness in time or space or both (in fact, usually both). Guthrie's theory emphasizes the importance of nearness in time between stimuli and responses, and therefore this word is also used to characterize his theory. (Answer: contiguity)
...