A Review of B. F. Skinner's Verbal Behavior
Noam Chomsky
In Leon A. Jakobovits and Murray S. Miron (eds.), Readings in the Psychology of Language, Prentice-Hall, 1967, pp. 142-143
Preface
Rereading this review after eight
years, I find little of substance that I would
change if I were to write it today. I am not
aware of any theoretical or
experimental work that challenges its conclusions;
nor, so far as I know, has there been any
attempt to meet the criticisms that are
raised in the review or to show that they are
erroneous or ill-founded.
I had intended this review not specifically as a criticism of Skinner's speculations regarding language, but rather as a more general critique of behaviorist (I would now prefer to say 'empiricist') speculation as to the nature of higher mental processes. My reason for discussing Skinner's book in such detail was that it was the most careful and thoroughgoing presentation of such
speculations, an evaluation that I feel is still
accurate. Therefore, if the conclusions
I attempted to substantiate in the
review are correct, as I believe they are, then
Skinner's work can be regarded as, in
effect, a reductio ad absurdum of
behaviorist assumptions. My personal view is that
it is a definite merit, not a defect, of Skinner's work that it can be used for this purpose, and it was for this reason that I tried to deal with it fairly exhaustively. I do
not see how his proposals can be improved upon,
aside from occasional details and oversights,
within
the framework of the general
assumptions that he accepts. I do not, in other
words, see any way in which his proposals
can be substantially improved within the general framework of behaviorist or neobehaviorist, or, more generally, empiricist ideas that have dominated much of modern linguistics, psychology, and philosophy.
The conclusion that I hoped to establish in the review, by
discussing these speculations in their most
explicit and detailed form, was that the
general point of view was largely
mythology, and that its widespread acceptance is
not the result of empirical support,
persuasive reasoning, or the absence of
a plausible alternative.
If I were
writing today on the same topic, I would try to
make it more clear than I did that I was
discussing Skinner's
proposals as a
paradigm example of a futile tendency in modern
speculation about language and mind. I would also
be
somewhat less apologetic and
hesitant about proposing the alternative view
sketched in Sections 5 and 11 -- and also
less ahistorical in proposing this alternative, since in fact it embodies assumptions that are not only plausible and relatively well-confirmed, so it appears to me, but also deeply rooted in a rich and largely forgotten tradition of rationalist psychology and linguistics.
I have tried to correct this imbalance in later
publications (Chomsky, 1962, 1964,
1966; see also Miller et al., 1960;
Katz and Postal, 1964; Fodor, 1965; Lenneberg,
1966).
I think it would also have been
valuable to try to sketch some of the reasons --
and there were many -- that have made
the view I was criticizing seem plausible over a long period, and also to discuss the reasons for the decline of the alternative rationalist
conception which, I was suggesting, should be
rehabilitated. Such a discussion would, perhaps,
have helped to place the specific
critique of Skinner in a more meaningful context.
References in the Preface
Chomsky, N., "Explanatory Models in Linguistics," in E. Nagel, P. Suppes, and A. Tarski (eds.), Logic, Methodology and Philosophy of Science. Stanford, Calif.: Stanford University Press, 1962.
----------, Current Issues
in Linguistic Theory. The Hague: Mouton and Co.,
1964.
----------, Cartesian
Linguistics. New York: Harper and Row, Publishers,
1966.
Fodor, J., "Could Meaning Be an rm?", Journal of Verbal Learning and Verbal Behavior, 1965.
Katz, J., and P. Postal, An Integrated Theory of Linguistic Description. Cambridge, Mass.: M.I.T. Press, 1964.
Lenneberg, E., Biological Bases of
Language. (In press.)
Miller, G. A., E.
Galanter, and K. H. Pribram, Plans
and
the Structure of Behavior. New York: Holt, Rinehart and Winston, Inc., 1960.
The Review
by
Noam Chomsky
I
A great many
linguists
and philosophers concerned
with
language
have expressed the
hope
that
their studies
might
ultimately be embedded
in a framework provided by behaviorist psychology,
and that refractory areas of investigation,
particularly those in which meaning is
involved, will in this way be opened up to
fruitful exploration. Since this volume is
the first large-scale attempt to
incorporate the major aspects of linguistic
behavior within a behaviorist framework, it
merits and will undoubtedly receive
careful attention. Skinner is noted for his
contributions
to the study of animal
behavior. The book under review is the
product of study of linguistic behavior extending
over more than twenty years.
Earlier versions of it have been fairly widely circulated, and there are quite a few references in the psychological literature to its major ideas.
The problem to which this book is addressed is that of giving a functional analysis of verbal behavior. By functional analysis, Skinner means identification
of the variables that control this behavior and
specification of how they interact
to
determine a particular verbal response.
Furthermore, the controlling variables are to be
described completely in
terms of such
notions as stimulus, reinforcement, deprivation,
which have been given a reasonably clear meaning
in
animal experimentation. In other
words, the goal of the book is to provide a way to
predict and control verbal behavior
by
observing and manipulating the physical
environment of the speaker.
Skinner
feels that recent advances in the laboratory study
of animal behavior permit us to approach this
problem with a certain optimism, since the basic processes and relations which give verbal behavior its special characteristics are now fairly well understood ... the results [of this experimental work] have been surprisingly free of species restrictions. Recent work has shown that the methods can be extended to human behavior without serious modification.
It is important to see clearly just what it is in Skinner's program and claims that makes them appear so bold and remarkable. It is not primarily the
fact that he has set functional analysis as his
problem, or that he limits himself to
study of observables, i.e., input-
output relations. What is so surprising is the
particular limitations he has imposed on
the way in which the observables of
behavior are to be studied, and, above all, the
particularly simple nature of the
function which, he claims, describes the causation of behavior.
One would naturally expect that prediction of the behavior of a complex organism (or machine) would require, in addition to information about external stimulation,
knowledge of the internal structure of
the organism, the ways in which it processes input
information and organizes its
own behavior. These characteristics of the organism are in general a complicated product of inborn structure, the
genetically determined course of
maturation, and past experience. Insofar as
independent neurophysiological evidence
is not available, it is obvious that
inferences concerning the structure of the
organism are based on observation of
behavior and outside events.
Nevertheless, one's estimate of the relative
importance of external factors and internal
structure in the determination of
behavior will have an important effect on the
direction of research on linguistic (or
any other) behavior, and on the kinds
of analogies from animal behavior studies that
will be considered relevant or
suggestive.
Putting it
differently, anyone who sets himself the problem
of analyzing the causation of behavior will (in
the absence of
independent
neurophysiological evidence) concern himself with
the only data available, namely the record of
inputs to
the organism and the
organism's present response, and will try to
describe the function specifying the response in
terms
of
the history of
inputs. This
is nothing more than the
definition of his problem. There are no possible
grounds for argument here, if one accepts the problem as legitimate, though Skinner has often advanced and defended this
definition of a problem as if it were a
thesis which other investigators reject. The
differences that arise between those who affirm and those who deny the importance of the specific contribution of the organism to learning and performance concern the particular character and complexity of this function, and the kinds of observations and research necessary for
arriving at a precise specification of it. If the
contribution of the organism is complex, the only
hope of predicting behavior even in a
gross way will be through a very indirect program
of research that begins by
studying the
detailed character of the behavior itself and the
particular capacities of the organism involved.
Skinner's thesis is that external
factors consisting of present stimulation and the
history of reinforcement (in particular, the frequency, arrangement, and withholding of reinforcing stimuli) are of overwhelming importance, and that the general principles revealed in laboratory studies of these phenomena provide the basis for understanding the complexities of verbal behavior. He confidently and repeatedly voices his claim to have demonstrated that the contribution of the
speaker is quite trivial and elementary, and that
precise prediction of verbal behavior involves
only
specification of the few external
factors that he has isolated experimentally with
lower organisms.
Careful study of this
book (and of the research on which it draws)
reveals, however, that these astonishing claims
are far from justified. It indicates, furthermore, that the insights that have been achieved in the laboratories of the reinforcement theorist, though quite genuine, can be applied to complex human behavior only in the most gross and superficial way, and that speculative attempts to discuss linguistic behavior in these terms alone omit from consideration factors of fundamental importance that are, no doubt, amenable to scientific study, although their
specific character cannot at present be
precisely formulated. Since Skinner's work is the
most extensive attempt to
accommodate
human behavior involving higher mental faculties
within a strict behaviorist schema of the type
that has
attracted many linguists and philosophers, as well as psychologists, a detailed documentation is of independent interest. The magnitude of the failure
of this attempt to account for verbal behavior
serves as a kind of measure of the
importance of the factors omitted from consideration, and an indication of how little is really known about this
remarkably complex phenomenon.
The force of Skinner's argument lies in
the enormous wealth and range of examples for
which he proposes a functional
analysis. The only way to evaluate the
success of his program and the correctness of his
basic assumptions about verbal
behavior
is to review these examples in detail and to
determine the precise character of the concepts in
terms of which
the functional analysis
is presented. Section 2 of this review describes
the experimental context with respect to which
these concepts are originally defined. Sections 3 and 4 deal with the basic concepts -- stimulus, response, and reinforcement -- Sections
6 to 10 with the new descriptive machinery
developed specifically for the description of
verbal
behavior. In Section 5 we
consider the status of the fundamental claim,
drawn from the laboratory, which serves as the
basis for the analogic guesses about
human behavior that have been proposed by many
psychologists. The final section
(Section 11) will consider some ways in which further linguistic work may play a part in clarifying some of these problems.
II
Although this book makes no direct
reference to experimental work, it can be
understood only in terms of the general
framework that Skinner has developed
for the description of behavior. Skinner divides
the responses of the animal into
two main categories. Respondents are purely reflex responses elicited by particular stimuli. Operants are emitted responses, for which no obvious stimulus can be discovered. Skinner has been concerned primarily with operant
behavior. The experimental arrangement
that he introduced consists basically of a box
with a bar attached to one wall
in such
a way that when the bar is pressed, a food pellet
is dropped into a tray (and the bar press is
recorded). A rat
placed in the box will
soon press the bar, releasing a pellet into the
tray. This state of affairs, resulting from the
bar
press, increases the strength of
the bar-pressing operant. The food pellet is
called a reinforcer; the event, a reinforcing
event. The strength of an operant is
defined by Skinner in terms of the rate of
response during extinction (i.e., after the
last reinforcement and before return to
the pre-conditioning rate).
Suppose
that release of the pellet is conditional on the
flashing of a light. Then the rat will come to
press the bar only
when the light
flashes. This is called stimulus discrimination.
The response is called a discriminated operant and
the
light is called the occasion for
its emission: this is to be distinguished from
elicitation of a response by a stimulus in the
case of the respondent.2 Suppose that
the apparatus is so arranged that bar-pressing of
only a certain character (e.g.,
duration) will release the pellet. The
rat will then come to press the bar in the
required way. This process is called
response differentiation. By successive
slight changes in the conditions under which the
response will be reinforced, it is
possible to shape the response of a rat
or a pigeon in very surprising ways in a very
short time, so that rather complex
behavior can be produced by a process
of successive approximation.
A stimulus can become reinforcing by repeated association with an already reinforcing stimulus. Such a stimulus is
called a secondary reinforcer. Like
many contemporary behaviorists, Skinner considers
money, approval, and the like to
be secondary reinforcers which have become reinforcing because of their association with food, etc.3
Secondary
reinforcers can be
generalized by associating them with a variety of
different primary reinforcers.
Another
variable that can affect the rate of the bar-
pressing operant is drive, which Skinner defines
operationally in terms of hours of deprivation. His major scientific book, Behavior of Organisms, is a study of the effects of food-deprivation and conditioning on the strength of the bar-pressing response of healthy mature rats.
Probably
Skinner's most original contribution to
animal behavior studies has been his investigation
of the effects of intermittent
reinforcement, arranged in various
different ways, presented in Behavior of Organisms
and extended (with pecking of
pigeons
as the operant under investigation) in the recent
Schedules of Reinforcement by Ferster and Skinner
(1957). It is apparently these studies that Skinner has in mind when he refers to the recent advances in the study of animal behavior.4
The notions
stimulus, response, reinforcement are relatively
well defined with respect to the bar-pressing
experiments
and others similarly
restricted. Before we can extend them to real-life
behavior, however, certain difficulties must be
faced. We must decide, first of all,
whether any physical event to which the organism
is capable of reacting is to be
called
a stimulus on a given occasion, or only one to
which the organism in fact reacts; and
correspondingly, we must
decide whether
any part of behavior is to be called a response,
or only one connected with stimuli in lawful ways.
Questions of this sort pose something of a dilemma for the experimental psychologist. If he accepts the broad
definitions, characterizing any
physical event impinging on the organism as a
stimulus and any part of the organism's
behavior as a response, he must
conclude that behavior has not been demonstrated
to be lawful. In the present state of
our knowledge, we must attribute an
overwhelming influence on actual behavior to ill-
defined factors of attention, set,
volition, and caprice. If we accept the narrower definitions, then behavior is lawful by definition (if it consists of
responses); but this fact is of limited
significance, since most of what the animal does
will simply not be considered
behavior.
Hence, the psychologist either must admit that
behavior is not lawful (or that he cannot at
present show that
it is -- not at all a
damaging admission for a developing science), or
must restrict his attention to those highly
limited areas in which it is lawful (e.g., with adequate controls, bar-pressing in rats; lawfulness of the observed behavior
provides, for Skinner, an implicit
definition of a good experiment).
Skinner does not consistently adopt
either course. He utilizes the experimental
results as evidence for the scientific
character of his system of behavior, and analogic guesses (formulated in terms of a metaphoric extension of the technical
vocabulary of the laboratory) as evidence for its
scope. This creates the illusion of a rigorous
scientific theory
with a very broad
scope, although in fact the terms used in the
description of real-life and of laboratory
behavior may
be mere homonyms, with at
most a vague similarity of meaning. To
substantiate this evaluation, a critical account
of his
book must show that with a
literal reading (where the terms of the
descriptive system have something like the
technical meanings given in Skinner's definitions) the book covers almost no aspect of linguistic behavior, and that with a
metaphoric reading, it is no more
scientific than the traditional approaches to this
subject matter, and rarely as clear
and
careful.5
III
Consider first
Skinner's use of the notions stimulus and
response. In Behavior of Organisms (9) he commits
himself to the
narrow definitions for
these terms. A part of the environment and a part
of behavior are called stimulus (eliciting,
discriminated, or reinforcing) and
response, respectively, only if they are lawfully
related; that is, if the dynamic laws
relating them show smooth and
reproducible curves. Evidently, stimuli and
responses, so defined, have not been shown
to figure very widely in ordinary human
behavior.6 We can, in the face of presently
available evidence, continue to
maintain the lawfulness of the relation between stimulus and response only by depriving them of their objective character.
A typical example of stimulus control for Skinner would be the response to a piece of music with the
utterance Mozart or to a painting with
the response Dutch. These responses are asserted to be under the control of extremely subtle properties of the physical object or event. Suppose instead of saying Dutch we had said Clashes with the wallpaper, I thought
you liked abstract work, Never saw it before,
Tilted, Hanging too low, Beautiful, Hideous,
Remember our camping trip last summer?,
or whatever else might come into our minds when
looking at a picture (in
Skinnerian
translation, whatever other responses exist in
sufficient strength). Skinner could only say that
each of these
responses is under the
control of some other stimulus property of the
physical object. If we look at a red chair and say
red, the response is under the control
of the stimulus redness; if we say chair, it is
under the control of the collection of
properties (for Skinner, the object)
chairness (110), and similarly for any other
response. This device is as simple as it is
empty.
Since properties are free for the asking (we have as many of them as we have nonsynonymous descriptive
expressions in
our language, whatever this means exactly), we can
account for a wide class of responses in terms of
Skinnerian functional analysis by
identifying the controlling stimuli. But the word
stimulus has lost all objectivity in this
usage. Stimuli are no longer part of
the outside physical world; they are driven back
into the organism. We identify the
stimulus when we hear the response. It
is clear from such examples, which abound, that
the talk of stimulus control
simply
disguises a complete retreat to mentalistic
psychology. We cannot predict verbal behavior in
terms of the stimuli
in the speaker's
environment, since we do not know what the current
stimuli are until he responds. Furthermore, since
we cannot control the property of a
physical object to which an individual will
respond, except in highly artificial cases,
Skinner's claim that his system, as
opposed to the traditional one, permits the
practical control of verbal behavior7 is
quite false.
Other examples
of stimulus control merely add to the general
mystification. Thus, a proper noun is held to be a response under the control of a specific person or thing (as controlling stimulus). I have often used the words Eisenhower and Moscow, which I
presume are proper nouns if anything is, but have
never been stimulated by the corresponding
objects. How can this fact be made
compatible with this definition? Suppose that I
use the name of a friend who is not
present. Is this an instance of a
proper noun under the control of the friend as
stimulus? Elsewhere it is asserted that a
stimulus controls a response in the
sense that presence of the stimulus increases the
probability of the response. But it
is
obviously untrue that the probability that a
speaker will produce a full name is increased when
its bearer faces the
speaker.
Furthermore, how can one's own name be a proper
noun in this sense?
A multitude of similar questions arise immediately. It appears that the word control here is merely a misleading
paraphrase for the traditional denote
or refer. The assertion (115) that so far as the speaker is concerned, the relation of reference is simply the probability that the speaker will emit a response of a given form in the presence of a stimulus having specified properties is surely incorrect if we take the words presence, stimulus, and probability in their literal sense. That they are not intended to be taken literally is indicated by many examples, as when a response is said to be controlled by a situation or state of affairs as stimulus. Thus, the expression a needle in a haystack may be controlled as a unit by a particular type of situation; the words of a single part of speech are under the control of a single set of subtle properties of stimuli (121); a whole sentence may be under the control of an extremely complex stimulus situation, or under the control of a state of affairs which might also control He is ailing; when a speaker observes events in a foreign country and reports upon his return, his report is under remote stimulus control; This is war may be a response to a confusing international situation; the suffix -ed is controlled by that subtle property of stimuli which we speak of as action-in-the-past, just as other responses are said to be under the control of such specific features of the situation as its currency (332). No characterization of the notion stimulus control that is remotely
related to the bar-pressing experiment (or that
preserves the faintest objectivity) can be made to
cover a set of examples like these, in which, for example, the controlling stimulus need not even impinge on the responding organism.
Consider now Skinner's use of the
notion response. The problem of identifying units
in verbal behavior has of course
been a
primary concern of linguists, and it seems very
likely that experimental psychologists should be
able to provide
much-needed assistance
in clearing up the many remaining difficulties in
systematic identification. Skinner recognizes
(20) the fundamental character of the
problem of identification of a unit of verbal
behavior, but is satisfied with an
answer so vague and subjective that it
does not really contribute to its solution. The
unit of verbal behavior -- the verbal
operant -- is defined as a class of responses of identifiable form functionally related to one or more controlling
variables. No method is suggested for
determining in a particular instance what are the
controlling variables, how many
such
units have occurred, or where their boundaries are
in the total response. Nor is any attempt made to
specify how
much or what kind of
similarity in form or control is required for two
physical events to be considered instances of the
same operant. In short, no answers are
suggested for the most elementary questions that
must be asked of anyone
proposing a
method for description of behavior. Skinner is
content with what he calls an extrapolation of the
concept of
operant developed in the
laboratory to the verbal field. In the typical
Skinnerian experiment, the problem of identifying
the unit of behavior is not too
crucial. It is defined, by fiat, as a recorded
peck or bar-press, and systematic variations in
the rate of this operant and its resistance to extinction are studied as a function of deprivation and scheduling of reinforcement (pellets). The operant is thus defined with respect to a particular experimental procedure. This is perfectly reasonable and has led to many interesting results. It is, however, completely meaningless to speak of extrapolating this concept of operant to ordinary verbal behavior.
Such an extrapolation leaves us with no way of justifying one or another decision about the units in the verbal repertoire.
Skinner specifies response strength as the basic datum, the basic dependent variable in his functional analysis. In the bar-pressing experiment, response strength is defined in terms of rate of emission during extinction. Skinner has argued8 that this is the only datum that varies significantly under conditions relevant to the 'learning process.' In the book under review, response strength means probability of emission (22). This definition provides
a comforting impression of objectivity, which,
however, is quickly dispelled when we look
into the matter more closely. The term
probability has some rather obscure meaning for
Skinner in this book.9 We are
told, on the one hand, that evidence for the contribution of each variable [to response strength] is based on observation of frequencies alone. At the same time, frequency is said to be a very misleading measure of strength, since, for example, the frequency of a response may be attributable primarily to the frequency of occurrence of controlling variables. It is not clear how the frequency of a response can be attributable to anything but the frequency of occurrence of its controlling variables if we accept Skinner's view that the behavior occurring in a given situation is fully determined by the relevant controlling variables. Furthermore, although the evidence for the contribution of each variable to response strength is based on observation of frequencies alone, it turns out that we base the notion of strength upon several kinds of evidence (22), in particular (22-28): emission of the response
(particularly in
unusual circumstances), energy level (stress),
pitch level, speed and delay of emission, size of
letters
etc. in writing, immediate
repetition, and -- a final factor, relevant but
misleading -- over-all frequency.
Of course, Skinner recognizes that these measures do not co-vary, because (among other reasons) pitch, stress,
quantity, and reduplication may have
internal linguistic functions.10 However, he does
not hold these conflicts to be
very
important, since in his view the proposed factors indicative of strength are well understood by the verbal community. For example, if we are shown a prized work of art and exclaim Beautiful!, the speed and energy of the response will not be lost on the owner. It does not appear totally obvious that in this case the way to impress the owner is to shriek Beautiful in a loud, high-pitched voice, repeatedly, and with no delay (high response strength). It may be equally effective to look at the picture silently (long delay) and then to murmur Beautiful in a soft, low-pitched voice (by definition, very low response strength).
It is not unfair, I
believe, to conclude from Skinner's discussion of
response strength, the basic datum in functional
analysis, that his extrapolation of the
notion of probability can best be interpreted as,
in effect, nothing more than a
decision
to use the word probability, with its favorable
connotations of objectivity, as a cover term to
paraphrase such
low-status words as
interest, intention, belief, and the like. This
interpretation is fully justified by the way in
which
Skinner uses the terms
probability and strength. To cite just one
example, Skinner defines the process of confirming an assertion in science as one of generating additional variables to increase its probability, and, more generally, its strength (425-29). If we take this
suggestion quite literally, the degree of
confirmation of a scientific assertion can be
measured as a simple function of the loudness, pitch, and frequency with which it is proclaimed, and a general
procedure for
increasing its degree of confirmation would be,
for instance, to train machine guns on large
crowds of
people who have been
instructed to shout it. A better indication of
what Skinner probably has in mind here is given by
his description of how the theory of evolution, as an example, is confirmed. This theory, he says, is made more plausible -- is strengthened -- by several types of construction based upon verbal responses in geology, paleontology, genetics, and so on. We are presumably to interpret the terms strength and probability in this context as paraphrases of more familiar locutions such as justified belief. Similar latitude of interpretation is presumably expected when we read that frequency of effective action accounts in turn for what we may call the listener's 'belief', a belief which is a function of, or identical with, our tendency to act upon the verbal stimuli which he provides.
I think it is evident, then,
that Skinner's use of the terms stimulus, control,
response, and strength justify the general
conclusion stated in the last paragraph
of Section 2. The way in which these terms are
brought to bear on the actual
data
indicates that we must interpret them as mere
paraphrases for the popular vocabulary commonly
used to describe
behavior and as having
no particular connection with the homonymous
expressions used in the description of laboratory
experiments. Naturally, this
terminological revision adds no objectivity to the
familiar mentalistic mode of description.
IV
The other fundamental
notion borrowed from the description of bar-
pressing experiments is reinforcement. It raises
problems which are similar, and even
more serious. In Behavior of Organisms, the operation of reinforcement is defined as the presentation of a certain kind
of stimulus in a temporal relation with either a
stimulus or response. A reinforcing
stimulus is defined as such by its
power to produce the resulting change [in
strength]. There is no circularity about this:
some stimuli are found to produce the
change, others not, and they are classified as
reinforcing and nonreinforcing
accordingly. This is a perfectly appropriate definition for the study of schedules of reinforcement. It is perfectly useless, however,
in the discussion of real-life behavior, unless we
can somehow characterize the stimuli which are
reinforcing (and the situations and
conditions under which they are reinforcing).
Consider first of all the status of the
basic principle that Skinner calls the law of conditioning: if the occurrence of an operant is followed by presence of a reinforcing stimulus, the strength is increased. As reinforcing stimulus was defined, this law becomes a tautology.13 For
Skinner, learning is just change in response
strength.14 Although the
statement that
presence of reinforcement is a sufficient
condition for learning and maintenance of behavior
is vacuous,
the claim that it is a necessary condition may have some content, depending on how the class of reinforcers (and appropriate situations) is
characterized. Skinner does make it very clear
that in his view reinforcement is a necessary
condition for language learning and for
the continued availability of linguistic responses
in the adult.15 However, the
looseness
of the term reinforcement as Skinner uses it in
the book under review makes it entirely pointless
to inquire
into the truth or falsity of
this claim. Examining the instances of what
Skinner calls reinforcement, we find that not even
the requirement that a reinforcer be an
identifiable stimulus is taken seriously. In fact,
the term is used in such a way
that the
assertion that reinforcement is necessary for
learning and continued availability of behavior is
likewise empty.
To show this, we consider some examples of reinforcement. First of all, we find a heavy appeal to automatic self-reinforcement. Thus, a man talks to himself... because of the reinforcement he receives (163); the child is reinforced automatically when he duplicates the sounds of airplanes, streetcars...; the child in the nursery may automatically reinforce his own exploratory verbal behavior when he produces sounds which he has heard in the speech of others; the speaker who is also an accomplished listener knows when he has correctly echoed a response and is reinforced thereby (68); thinking is behaving which automatically affects the behaver and is reinforcing because it does so; and verbal fantasy, whether overt or covert, is automatically reinforcing to the speaker as listener. Just as the musician
plays or composes what he is reinforced
by hearing, or as the artist paints what
reinforces him visually, so the speaker
engaged in verbal fantasy says what he is reinforced by hearing or writes what he is reinforced by reading (439);
similarly, care in problem solving, and
rationalization, are automatically self-
reinforcing (442-43). We can also reinforce
someone by emitting verbal behavior as
such (since this rules out a class of aversive
stimulations, 167), by not emitting
verbal behavior (keeping silent and
paying attention, 199), or by acting appropriately
on some future occasion (152:
strength
of [the speaker's] behavior is determined mainly
by the behavior which the listener will exhibit
with respect to
a given state of
affairs). In most such cases, of course, the
speaker is not present at the time when the
reinforcement takes place, as when the artist... is reinforced by the effects his works have upon... others, or when the writer knows that his verbal behavior may not be reinforced often or immediately, but his net reinforcement may be great. An individual may also find it reinforcing to injure someone by criticism or by bringing bad news, or to publish an experimental result which upsets the theory of a rival (154), to describe circumstances which would be reinforcing if they were to occur (165), to avoid repetition (222), to hear his own name though in fact it was not mentioned or to hear nonexistent
words in his child's babbling (259), to clarify or
otherwise intensify the effect of a
stimulus which serves an important
discriminative function (416), and so on.
From this sample, it can be seen that
the notion of reinforcement has totally lost
whatever objective meaning it may
ever
have had. Running through these examples, we see
that a person can be reinforced though he emits no
response at
all, and that the
reinforcing stimulus need not impinge on the
reinforced person or need not even exist (it is
sufficient
that it be imagined or hoped
for). When we read that a person plays what music
he likes (165), says what he likes (165),
thinks what he likes (438-39), reads
what books he likes (163), etc., BECAUSE he finds
it reinforcing to do so, or that we
write books or inform others of facts
BECAUSE we are reinforced by what we hope will be
the ultimate behavior of
reader or
listener, we can only conclude that the term
reinforcement has a purely ritual function. The
phrase X is reinforced by Y (stimulus, state of affairs, event, etc.) is being used as a cover term for X wants Y, X likes Y, X wishes that Y were the case, etc. Any idea that this paraphrase introduces any new clarity or objectivity into the description of wishing, liking, etc., is a serious delusion.
The only effect is to obscure the
important differences among the notions being
paraphrased. Once we recognize the
latitude with which the term
reinforcement is being used, many rather startling
comments lose their initial effect -- for
instance, that the behavior of the
creative artist is controlled entirely by the contingencies of reinforcement.
What has been hoped
for from the psychologist is some indication how
the casual and informal description of everyday
behavior in the popular vocabulary can
be explained or clarified in terms of the notions
developed in careful experiment
and
observation, or perhaps replaced in terms of a
better scheme. A mere terminological revision, in
which a term
borrowed from the
laboratory is used with the full vagueness of the
ordinary vocabulary, is of no conceivable
interest.
It seems that Skinner's claim
that all verbal behavior is acquired and
maintained in strength through reinforcement is quite empty, because his
notion of reinforcement has no clear content,
functioning only as a cover term for any factor,
detectable or not, related to
acquisition or maintenance of verbal behavior.16
Skinner's use of the term conditioning
suffers from a similar difficulty. Pavlovian and operant conditioning are processes about which psychologists have developed real understanding. Instruction of human beings is not. The claim that instruction and imparting of information are simply matters of conditioning (357-66) is pointless. The claim is true, if we extend the term
conditioning to cover these processes,
but we know no more about them after having
revised this term in such a way as
to
deprive it of its relatively clear and objective
character. It is, as far as we know, quite false,
if we use conditioning in
its literal
sense. Similarly, when we say that conditioning transfers a response from one term to another or from one object to another, is this true of the predication Whales
are mammals? Or, to take Skinner's example, what
point is there in saying that the effect
of The telephone is out of order on the
listener is to bring behavior formerly controlled
by the stimulus out of order
under
control of the stimulus telephone (or the
telephone itself) by a process of simple
conditioning (362)? What laws of
conditioning hold in this case?
Furthermore, what behavior is controlled by the
stimulus out of order, in the abstract?
Depending on the object of which this
is predicated, the present state of motivation of
the listener, etc., the behavior
may
vary from rage to pleasure, from fixing the object
to throwing it out, from simply not using it to
trying to use it in
the normal way
(e.g., to see if it is really out of order), and
so on. To speak of bringing available behavior under control of a new stimulus adds nothing to the description of such a case.
V
The claim that careful arrangement of
contingencies of reinforcement by the verbal
community is a necessary condition
for language-learning has appeared, in one form or another, in many places.17 Since it is based not on actual
observation, but on
analogies to laboratory study of lower organisms,
it is important to determine the status of the
underlying assertion within
experimental psychology proper. The most common
characterization of reinforcement (one
which Skinner explicitly rejects, incidentally) is in terms of drive reduction. This characterization can be given
substance by defining drives in some
way independently of what in fact is learned. If a
drive is postulated on the basis of
the
fact that learning takes place, the claim that
reinforcement is necessary for learning will again
become as empty as
it is in the
Skinnerian framework. There is an extensive
literature on the question of whether there can be
learning
without drive reduction
(latent learning). The classic experiment of Blodgett showed that rats who had explored a maze without reward showed a marked drop in number of errors (as compared to a control group which had not explored the maze) upon introduction of
a food reward, indicating that the rat had learned
the structure of the maze
without
reduction of the hunger drive. Drive-reduction
theorists countered with an exploratory drive
which was reduced
during the pre-reward
learning, and claimed that a slight decrement in
errors could be noted before food reward. A
wide variety of experiments, with
somewhat conflicting results, have been carried
out with a similar design.18 Few
investigators still doubt the existence of the phenomenon. E. R. Hilgard, in his general review of learning theory,19 concludes that there is no longer any doubt but that, under appropriate circumstances, latent learning is demonstrable.
More recent
work has shown that novelty and variety of
stimulus are sufficient to arouse curiosity in the
rat and to
motivate it to explore
(visually), and in fact, to learn (since on a
presentation of two stimuli, one novel, one
repeated,
the rat will attend to the
novel one),20 that rats will learn to choose the
arm of a single-choice maze that leads to a
complex maze, running through this
being their only reward;21 and that monkeys will maintain their
performance at a high level of efficiency with
visual exploration (looking out of a window for 30
seconds)
as the only reward22 and,
perhaps most strikingly of all, that monkeys and
apes will solve rather complex manipulation
problems that are simply placed in their cages, and will solve discrimination problems with only exploration and manipulation as incentives.23 In these cases, solving the problem is apparently its own reward. Results of this kind can be handled by reinforcement theorists
only if they are willing to set up curiosity,
exploration, and manipulation drives,
or to speculate somehow about acquired
drives24 for which there is no evidence outside of
the fact that learning takes
place in
these cases.
There is a variety of other kinds of evidence that has been offered to challenge the view that drive reduction is necessary for learning. Results on sensory-sensory conditioning have been interpreted as demonstrating learning without drive
reduction.25 Olds has reported reinforcement by
direct stimulation of the brain, from which he
concludes
that reward need not satisfy
a physiological need or withdraw a drive
stimulus.26 The phenomenon of imprinting, long
observed by zoologists, is of
particular interest in this connection. Some of
the most complex patterns of behavior of
birds, in particular, are directed
towards objects and animals of the type to which
they have been exposed at certain
critical early periods of life.27
Imprinting is the most striking evidence for the
innate disposition of the animal to learn
in a certain direction and to react
appropriately to patterns and objects of certain
restricted types, often only long after
the original learning has taken place. It is, consequently, unrewarded learning, though the resulting patterns of
behavior may be refined
through reinforcement. Acquisition of the typical
songs of song birds is, in some cases, a type of
imprinting. Thorpe reports studies that
show that some characteristics of the normal song have been learned in the earliest youth, before the bird itself is able to produce any kind of full song. This phenomenon has recently been investigated under
laboratory conditions and controls with positive
results.29
Phenomena of this general
type are certainly familiar from everyday
experience. We recognize people and places to
which we have given no particular
attention. We can look up something in a book and
learn it perfectly well with no
other
motive than to confute reinforcement theory, or
out of boredom, or idle curiosity. Everyone
engaged in research
must have had the
experience of working with feverish and prolonged
intensity to write a paper which no one else will
read or to solve a problem which no one
else thinks important and which will bring no
conceivable reward -- which may
only
confirm a general opinion that the researcher is
wasting his time on irrelevancies. The fact that
rats and monkeys
do likewise is interesting and important to show in careful experiment. In fact, studies of behavior of the type mentioned above have an independent and positive significance that far outweighs their incidental importance in bringing into question
the claim that learning is impossible without
drive reduction. It is not at all unlikely that
insights arising from animal behavior studies with this broadened scope may have the kind of relevance to such complex activities as verbal behavior that reinforcement theory has, so far, failed to exhibit. In any event, in the light of
presently available evidence, it is
difficult to see how anyone can be willing to
claim that reinforcement is necessary for
learning, if reinforcement is taken seriously as something identifiable independently of the resulting change in behavior.
Similarly, it
seems quite beyond question that children acquire
a good deal of their verbal and nonverbal behavior
by
casual observation and imitation of
adults and other children.30 It is simply not true
that children can learn language
only through care on the part of adults who shape their verbal repertoire through careful differential
reinforcement, though it may be that
such care is often the custom in academic
families. It is a common observation
that a young child of immigrant parents
may learn a second language in the streets, from
other children, with amazing
rapidity,
and that his speech may be completely fluent and
correct to the last allophone, while the
subtleties that
become second nature to
the child may elude his parents despite high
motivation and continued practice. A child may
pick up a large part of his vocabulary and feel for sentence structure from television, from reading, from listening to adults, etc. Even a very young
child who has not yet acquired a minimal
repertoire from which to form new utterances
may imitate a word quite well on an
early try, with no attempt on the part of his
parents to teach it to him. It is also
perfectly obvious that, at a later
stage, a child will be able to construct and
understand utterances which are quite new,
and are, at the same time, acceptable sentences in his language. Every time an adult reads a newspaper, he
undoubtedly comes upon countless new
sentences which are not at all similar, in a
simple, physical sense, to any that he
has heard before, and which he will recognize as sentences and understand; he will also be able to detect slight distortions or misprints. Talk of stimulus generalization in such a case simply perpetuates the mystery under a new title. These abilities indicate that there must be fundamental processes at work quite independently of feedback from the environment. I have been able to
find no support whatsoever for the doctrine of
Skinner and others that slow and
careful shaping of verbal behavior
through differential reinforcement is an absolute
necessity. If reinforcement theory
really requires the assumption that
there be such meticulous care, it seems best to
regard this simply as a reductio ad
absurdum argument against this
approach. It is also not easy to find any basis
(or, for that matter, to attach very much
content) to the claim that reinforcing
contingencies set up by the verbal community are
the single factor responsible for
maintaining the strength of verbal
behavior. The sources of the strength of this behavior are almost a total mystery at present.
Reinforcement undoubtedly plays a significant
role, but so do a variety of motivational factors
about which
nothing serious is known in
the case of human beings.
As far as acquisition of language is concerned, it seems clear that reinforcement, casual observation, and natural inquisitiveness (coupled with a strong tendency to imitate) are important factors, as is the remarkable capacity of the child to generalize, hypothesize, and process information in a variety of very special and apparently highly complex
ways which we cannot
yet describe or begin to understand, and which may
be largely innate, or may develop through
some sort of learning or through maturation of the nervous system. The manner in which such factors operate and
interact in language acquisition is
completely unknown. It is clear that what is
necessary in such a case is research, not
dogmatic and perfectly arbitrary
claims, based on analogies to that small part of
the experimental literature in which
one happens to be interested.
The pointlessness of these claims
becomes clear when we consider the well-known
difficulties in determining to what
extent inborn structure, maturation, and learning are responsible for the particular form of a skilled or complex
performance.31 To
take just one example,32 the gaping response of a
nestling thrush is at first released by jarring of
the nest, and, at a later stage, by a
moving object of specific size, shape, and
position relative to the nestling. At this
later stage the response is directed
toward the part of the stimulus object
corresponding to the parent's head, and
characterized by a complex
configuration of stimuli that can be precisely
described. Knowing just this, it would be
possible to construct a speculative,
learning-theoretic account of how this sequence of
behavior patterns might have
developed through a process of differential reinforcement, and it would no doubt be possible to train rats to do something similar. However, there appears to be good evidence that these responses to fairly complex stimuli are genetically determined and mature
without learning. Clearly, the possibility cannot
be discounted. Consider now the
comparable case of a child imitating
new words. At an early stage we may find rather
gross correspondences. At a later
stage, we find that repetition is of
course far from exact (i.e., it is not mimicry, a
fact which itself is interesting), but
that it reproduces the highly complex
configuration of sound features that constitute
the phonological structure of the
language in question. Again, we can
propose a speculative account of how this result
might have been obtained through
elaborate arrangement of reinforcing
contingencies. Here too, however, it is possible
that ability to select out of the
complex auditory input those features that are phonologically relevant may develop largely independently of reinforcement, through genetically determined maturation. To the extent that this is true, an account of the development and causation of behavior that fails to consider the structure of the organism will provide no
understanding of the real processes
involved.
It is often argued that
experience, rather than innate capacity to handle
information in certain specific ways, must be
the
factor
of
overwhelming
dominance
in
determining
the
specific
character
of
language
acquisition,
since
a
child
speaks the language of the group in
which he lives. But this is a superficial
argument. As long as we are speculating, we
may consider the possibility that the brain has evolved to the point where, given an input of observed Chinese sentences, it produces (by an induction of apparently fantastic complexity and suddenness) the rules of Chinese grammar, and given an input of observed English sentences, it produces (by, perhaps, exactly the same process of induction) the rules of English grammar; or that given an observed application of a term to certain instances, it automatically predicts the extension to a class of complexly related instances. If clearly recognized as such, this
speculation is neither unreasonable nor
fantastic; nor, for that matter, is it beyond the
bounds of possible study. There
is of
course no known neural structure capable of
performing this task in the specific ways that
observation of the
resulting
behavior might lead us to postulate; but for that
matter, the structures capable of accounting for
even the
simplest kinds of learning
have similarly defied detection.33 Summarizing
this brief discussion, it seems that there is
neither empirical evidence nor any
known argument to support any specific claim about
the relative importance of feedback from the environment and the independent contribution of the organism in the process of language acquisition.
VI
We now turn to the system
that Skinner develops specifically for the
description of verbal behavior. Since this system
is
based on the notions stimulus,
response, and reinforcement, we can conclude from
the preceding sections that it will be
vague and arbitrary. For reasons noted
in Section 1, however, I think it is important to
see in detail how far from the
mark any
analysis phrased solely in these terms must be and
how completely this system fails to account for
the facts of
verbal
behavior.
Consider first the term verbal behavior itself. This is defined as behavior reinforced through the mediation of other persons (2). The definition is clearly much too broad. It would include as verbal behavior, for
example, a rat pressing the bar in a
Skinner-box, a child brushing his teeth, a boxer
retreating before an opponent, and
a
mechanic repairing an automobile. Exactly how much
of ordinary linguistic behavior is verbal in this
sense, however, is
something of a
question: perhaps, as I have pointed out above, a
fairly small fraction of it, if any substantive
meaning is
assigned to the term
reinforced. This definition is subsequently
refined by the additional provision that the
mediating
response of the reinforcing
person (the listener) must itself have been conditioned precisely in order to reinforce the behavior of the speaker (225, italics his). This still covers the examples given above, if we can assume that the
reinforcing behavior of the
psychologist, the parent, the
opposing
boxer, and the paying customer are the result of
appropriate
training,
which
is
perhaps
not
unreasonable.
A significant part of the fragment of linguistic behavior
covered by the earlier definition will
no doubt be excluded by the refinement, however.
Suppose, for example, that
while
crossing the street I hear someone shout Watch out
for the car and jump out of the way. It can hardly
be proposed
that my jumping (the
mediating, reinforcing response in Skinner's
usage) was conditioned (that is, I was trained to
jump)
precisely in order to reinforce
the behavior of the speaker; and similarly, for a
wide class of cases. Skinner's assertion
that with this refined definition we narrow our subject to what is traditionally recognized as the verbal field appears to be grossly in error.
VII
Verbal operants are classified by Skinner in terms of their relation to discriminated stimulus, reinforcement, and other verbal
responses. A mand is defined as a verbal operant in which the response is reinforced by a characteristic consequence and is therefore under the functional control of relevant conditions of deprivation or aversive stimulation. This definition raises a host of problems. A mand such as Pass the salt is a
class of responses. We cannot tell by observing
the form of a
response whether it belongs to this class (Skinner is very clear about this), but only by identifying the controlling
variables. This is generally
impossible. Deprivation is defined in the bar-
pressing experiment in terms of length of time
that the animal has not been fed or
permitted to drink. In the present context,
however, it is quite a mysterious notion.
No attempt is made here to describe a
method for determining the relevant condition of deprivation, apart from the decision of the experimenter. If we define
deprivation in terms of elapsed time, then at any
moment a person is in countless states of
deprivation.34 It appears that we must
decide that the relevant condition of deprivation
was (say) salt-deprivation, on
the
basis of the fact that the speaker asked for salt
(the reinforcing community which knows his predicament). In this case, the
assertion that a mand is under the control of
relevant deprivation is empty, and we are
(contrary to Skinner's intention)
identifying the response as a mand completely in
terms of form. The word relevant in
the
definition above conceals some rather serious
complications.
In the case of the mand
Pass the salt, the word deprivation is not out of
place, though it appears to be of little use for
functional analysis. Suppose however
that the speaker says Give me the book, Take me
for a ride, or Let me fix it. What
kinds of deprivation can be associated
with these mands? How do we determine or measure
the relevant deprivation? I
think we must conclude in this case, as before, either that the notion deprivation is relevant at most to a minute fragment of verbal behavior, or else that the statement X is under Y-deprivation is just an odd paraphrase for X wants Y. The
notion aversive control is just as confused. This
is intended to cover threats, beating, and the
like (33). The manner
in which aversive
stimulation functions is simply described. If a
speaker has had a history of appropriate
reinforcement
(e.g., if a certain
response was followed by cessation of a threat of injury -- of events which have previously been followed by such injury and which are therefore conditioned aversive stimuli), then he will tend to give the proper response when the
threat which had previously been followed by the
injury is presented. It would appear to follow
from
this description that a speaker
will not respond properly to the mand Your money
or your life (38) unless he has a past
history of being killed. But even if
the difficulties in describing the mechanism of
aversive control are somehow removed
by
a more careful analysis, it will be of little use
for identifying operants for reasons similar to
those mentioned in the
case of
deprivation.
It seems, then, that in
Skinner's terms there is in most cases no way to
decide whether a given response is an instance of
a particular mand. Hence it is
meaningless, within the terms of his system, to
speak of the characteristic consequences
of a mand, as in the definition above. Furthermore, even if we extend the system so that mands can somehow be identified, we will have to face the obvious fact that most of us are not fortunate enough to have our requests,
commands, advice, and so on
characteristically reinforced (they may
nevertheless exist in considerable strength).
These responses could therefore not be considered mands by Skinner. In fact, Skinner sets up a category of magical mands (48-49) to cover the case of mands that cannot be accounted for by showing that they have ever had the effect specified or any similar effect upon similar occasions (the word ever in this statement should be replaced by characteristically). In these pseudo-mands, the speaker simply describes the reinforcement appropriate to a given state of deprivation or aversive stimulation.
In other words, given the meaning that we have been led to assign to reinforcement and deprivation, the speaker asks for what he wants. The remark that speakers create new mands on the analogy of old ones does not help matters.
Skinner's claim that his new descriptive system is superior to the traditional one because its terms can be defined with respect to experimental operations comes to very little here: the relevant experimental operations amount to pointing out a relation between rate of bar-pressing and hours of food-deprivation, and replacing X wants Y by X is deprived of Y adds no new objectivity to the description of behavior. A further claim for the
analysis of
mands is that it provides an objective basis for
the traditional classification into requests,
commands, etc.
(38-41). The traditional
classification is in terms of the intention of the
speaker. But intention, Skinner holds, can be
reduced to contingencies of
reinforcement, and, correspondingly, we can
explain the traditional classification in terms