Noam Chomsky – The biolinguistic turn lecture notes – part one
Almost exactly 35 years ago I had the opportunity to give several lectures here, in the same auditorium I think, on the topic "language and mind". And quite a lot has been learned in the intervening years about language and the brain, hence the mind, in the sense in which I used the term then, the term mind, mental and such terms. I am using these terms as just descriptive terms for certain aspects of the world, pretty much on a par with such descriptive terms as chemical, optical, electrical and so on. These are terms used to focus attention on particular aspects of the world that seem to have a rather integrated character and to be worth considering for special investigation, but without any illusions that they "cut nature at the joints".
In those earlier lectures I took for granted that human language can reasonably be studied as part of the world, specifically as a property of the human organism, mostly the brain, and for convenience I keep to that here. Both then and now, I am adopting what Lyle Jenkins called the BIOLINGUISTIC PERSPECTIVE. That is the framework within which the approach to language that I am considering developed about 50 years ago. Also for convenience I use the term language to refer to human language. That is a specific biological system. There is no meaningful question as to whether the communication system of bees, or what might be taught to apes, or mathematics or music, are languages, or whether airplanes really fly or submarines really swim, or whether computers think or translate languages, or other comparably meaningless questions, many of them based on a misinterpretation of an important paper by Alan Turing in 1950, which spawned a large and mostly misguided literature, despite Turing's very explicit warning not to pursue that direction, which has apparently been overlooked.
From the biolinguistic perspective, language is a component of human biology, more or less on a par with mammalian vision or insect navigation and other systems for which the best theories that have been devised attribute computational capacity of some kind, what in informal usage is sometimes called rule following. For example, a contemporary text on vision describes the so-called rigidity principle, formulated about 50 years ago, as follows: "if possible, and other rules permit, interpret image motions as projections of rigid motions in three dimensions." In this case, later work provided substantial insights into the mental computations that seem to be involved when the visual system follows these rules, in informal terminology. But even for simple organisms that is no slight task, and a great many issues remain unresolved in these areas, which are quite obscure even for insects.
The decision to study language as part of the world in this sense should, in my view, be uncontroversial, but it has not been. On the contrary, the assumption that this is a legitimate enterprise was pretty forcefully rejected and continues to be rejected. Virtually all of contemporary philosophy of language and mind is based on rejection of this assumption. The same is true of what is called the computer model of the mind, which underlies a good deal of theoretical cognitive science; in this case the assumption is denied not only for language but for mental faculties generally. It is explicitly denied in the technical linguistic literature, in what I call platonistic accounts of language, and also, in a different way, in the conceptualism that is devised by the same authors and inaccurately attributed to many linguists, including me.
It is also apparently denied by many sociolinguists, and it is incompatible with structural-behavioral approaches to language. It is, little to my surprise, rejected by current studies of language by leading neuroscientists, most notably Terrence Deacon in recent work which has been favorably received by eminent biologists. The approach therefore seems to be controversial, but I think the appearances are misleading. A more careful look will show, I think, that the basic assumptions are tacitly adopted even by those who strenuously reject them, and indeed have to be adopted, even for coherence.
I put aside this interesting topic of contemporary intellectual history and simply assume that language can be studied as part of the world. I continue, in other words, to pursue the biolinguistic approach that took shape half a century ago, heavily influenced by ethology and comparative psychology, and intensively pursued since then along quite a few different paths, including much of the work that claims to reject the approach.
Noam Chomsky – The biolinguistic turn lecture notes – part two
Assuming that, I turn to some things that ought to be obvious. It can scarcely be denied that some internal state is responsible for the fact that I speak and understand some variety of what is loosely called English, but not, say, Hindi or Korean. To borrow, and in fact adapt, a traditional term, we can call this state, wherever it is, that is internal to me, a state of the human faculty of language, primarily a state of the brain. We can call each such state an internalized language, in the technical literature often called an I-language. For simplicity I call it that. It should also be uncontroversial that the faculty of language has an initial state, part of our biological endowment, which permits a certain range of options: the attainable I-languages.
The faculty of language, then, is a special property that enables my granddaughter, but not her pet kitten or chimpanzee, to attain a specific I-language on exposure to appropriate data, data which her mind in some obscure way is able to extract from the blooming, buzzing confusion and interpret as linguistic experience. That is no slight task. Nobody knows how it is done, but it obviously is done. More accurately, every infant acquires a complex of such states; that is a complication, but I put it aside. The expectation is that language is like everything else in the organic world and is therefore based on a genetically determined initial state that distinguishes, say, my granddaughter from my pets. That assumption has been called the innateness hypothesis, and there is a substantial literature debating the validity of the innateness hypothesis.
The literature has a curious character. There are lots of condemnations of the hypothesis, but it is never formulated, and nobody defends it. Its alleged advocates, of whom I am one, have no idea what the hypothesis is. Everyone has some innateness hypothesis concerning language, at least everyone who is interested in the difference between an infant and, say, her pets.
Furthermore, the invented term "innateness hypothesis" is completely meaningless. There is no specific innateness hypothesis; rather, there are various hypotheses about what might be the initial, genetically determined state. These hypotheses are of course constantly changing as more is learned. All of that should be obvious. Confusion about these matters has reached such extreme levels that it is becoming hard even to unravel, but I put this aside.
The biolinguistic approach takes mental faculties to be states of the organism. In particular, internal languages (I-languages) are states of the faculty of language. I focus on language, but most of what follows should hold as well for other cognitive faculties, and in fact for far simpler organisms (bee communication or navigation). Well, when we adopt this approach, several questions arise at once.
The central one is to determine the nature of the initial and attained states. And though the matter appears to be controversial, I know of no serious alternative to the thesis that these are in substantial measure computational states, whether we have in mind insect navigation or what you and I are doing right now. Again, that is held to be controversial, but since there are no alternative ideas, I don't understand why. It is held to be controversial for humans; it is not held to be controversial for, say, insect navigation, yet the question is about the same.
Investigation of the brain in these terms is sometimes called psychological, and it is contrasted with investigation in terms of cells, chemical processes, electrical activity and so on, which is called physiological. These are again terms of convenience; they don't have any sharp boundaries. Chemistry and physics were distinguished in pretty much the same way not very long ago. The formulas involving complex molecules that we now study in school were pretty recently considered to be "merely classificatory symbols that summarize the observed course of the reaction. The ultimate nature of the molecular groupings was held to be unsolvable, and the actual arrangements within a molecule, if this means anything, was never to be read into the formula".
Kekulé, whose structural chemistry paved the way to the eventual unification of chemistry and physics, doubted that the absolute constitution of organic molecules could ever be given. His own models, his analysis of valency and so on, were to have only an instrumental interpretation, as calculating devices. Large parts of physics were understood in the same way by prominent scientists, including the molecular theory of gases, even Bohr's model of the atom.
In fact, only a few years before physics and chemistry were united in Linus Pauling's account of the chemical bond, America's first Nobel Prize-winning chemist dismissed talk about the real nature of the chemical bond as, in his terms, "metaphysical twaddle"; it was "nothing more than a very crude method of representing certain known facts about chemical reactions, a mode of representation only", just a calculating device. The rejection of this skepticism by a few leading scientists, whose views were incidentally condemned as a conceptual absurdity, paved the way to the eventual unification.
These very recent debates in the hard sciences, we are talking about the 1920s, have, I think, considerable relevance for today's controversies about computational theories of cognitive capacity, from insects to humans. It is an important topic, one that I have discussed a little elsewhere, and it deserves more attention than it receives.
Noam Chomsky – The biolinguistic turn lecture notes – part three
Well, with the biolinguistic approach in place, we want to discover the relationship between psychological states and the world as described in other terms. We want to know how computational states are related to, or represented in, neurophysiological states, in one terminology. We also want to find out how those mental states relate to the organism-external world, as for example when the motions and noises produced by our forager bee direct others to a distant flower, or when I talk about a recent trip to India, or when I say that I recently read Darwin's Descent of Man, with "Descent of Man" referring to a book. All of this is called intentionality in philosophical jargon.
The broad issues were raised prominently at the end of the Decade of the Brain, which brought the last millennium to a close. The American Academy of Arts and Sciences published a volume in the year 2000 to mark the occasion, summarizing the current state of understanding in these areas. The guiding theme of the volume was formulated by a distinguished neuroscientist, Vernon Benjamin Mountcastle, in the introduction to the collection. It is, in his words, "the thesis that things mental, indeed minds, are emergent properties of brains", while these emergent properties "are not regarded as irreducible but are produced by principles that control the interaction between lower level events, principles we do not yet understand." That same thesis has been put forth in recent years as an "astonishing hypothesis of the new biology", a "radical new idea in the philosophy of mind", the "bold assertion that mental phenomena are entirely natural and caused by the neurophysiological activities of the brain", opening the door to new and promising inquiry, and so on.
Contributors to the American Academy volume were for the most part quite optimistic about the prospects for closing the remaining gaps between psychological and physiological accounts. Mountcastle's phrase "we do not yet understand" reflects that optimism; it suggests we will soon understand. He wrote that "researchers speak confidently of a coming solution to the brain-mind problem". Similar confidence has been expressed for half a century, including announcements by prominent scientists, a Nobel Prize winner in one case, that the brain-mind problem has already been solved.
We may usefully recall similar optimism shortly before the unification of chemistry and physics. In 1929 Bertrand Russell, who knew the sciences well, wrote that "chemical laws cannot at present be reduced to physical laws". His phrase "at present", like Mountcastle's "yet", expresses the expectation that the reduction should take place in the course of scientific progress, perhaps soon. Now, in the case of physics and chemistry it never did take place. What happened was something different and totally unexpected, namely the unification of a virtually unchanged chemistry with a radically revised physics. And it is hardly necessary to stress the fact that the state of understanding and achievement in those areas 50 to 80 years ago was far beyond anything that can be claimed for the brain and cognitive sciences today. That ought to give us pause.
The American Academy volume reviews many important discoveries, but the leading thesis should arouse our skepticism, and not only for the reason that I just mentioned. Another reason is that the thesis is by no means new. In fact, it was formulated in virtually the same words two centuries ago, in the late 18th century, by the eminent chemist Joseph Priestley. He wrote that "properties of mind arise from the organisation of the nervous system itself", and that those properties "termed mental are the result of the organic structure of the brain", just as matter is possessed of powers of attraction and repulsion that act at a distance, contrary to the founding principles of the modern scientific revolution, from Galileo and Newton and beyond.
Noam Chomsky – The biolinguistic turn lecture notes – part four
Half a century before Priestley, David Hume had casually described thought as "a little agitation of the brain", and shortly after, the French philosopher-physician Cabanis wrote that the brain "must be considered a special organ designed to produce thought, as the stomach and the intestines are designed to operate the digestion, the liver to filter bile, and the various glands to produce the salivary juices". A century later Darwin asked rhetorically why "thought, being a secretion of the brain", should be considered "more wonderful than gravity, which is a property of matter". Actually, these and many other conceptions developed from an inquiry into what was called "thinking matter", in part from what is sometimes called by historians of philosophy John Locke's suggestion, that is, his observation that God might have chosen to "superadd to matter a faculty of thinking", just as he "annexed effects to motion which we can in no way conceive motion able to produce". The theological apparatus may well have been for self-defense, as Locke's correspondence suggests.
By the late 18th century the thesis was widely regarded as inescapable. Newton had demonstrated, to his considerable dismay, that matter does not exist in the sense of the Galilean revolution and of the scientists of his own day, and in his own sense. That being the case, the mind-body problem could not even be formulated, at least in anything resembling the classical form. Current formulations seem at best to restate the problem of unification of psychological and physiological approaches, and to do so in highly misleading terminology. There was no mind-body problem, any more than there was a chemistry-physics problem in the 1920s.
Newton's discoveries led to no coherent alternative to the conclusion that was drawn by Hume, Priestley and others, and rediscovered today in pretty much the same terms, but with the problem of emergence as unresolved as it was two centuries ago. That includes the question whether this notion, with its reductionist connotation, is even the right notion; maybe it is the wrong notion, as proved to be the case for chemistry and physics.
The traditional mind-body problem is often ridiculed as the problem of the "ghost in the machine". But this is a misconception: Newton exorcised the machine; he left the ghost completely intact. A similar observation has been made very recently by two physicists, Paul Davies and John Gribbin, concluding a book of theirs, The Matter Myth. They write that during the triumphal phase of materialism and mechanism in the 1930s, Gilbert Ryle derided mind-body dualism in a pithy reference to the mind part as the "ghost in the machine". But already when Ryle coined the pithy expression in the 1930s, the new physics was at work undermining the materialist world view on which Ryle's philosophy was based.
By the end of the 20th century, they continue, "we can see that Ryle was right to dismiss the notion of the ghost in the machine, not because there is no ghost, but because there is no machine". Their point is correct, but the timing is off by at least two centuries, actually three, although it took some time for Newton's demolition of the mechanical philosophy, the belief that the world is a machine, to enter scientific common sense.
Newton himself was well aware of the conclusion, and far from pleased by it. He regarded his own conclusion as an absurdity that no serious person could entertain, and he sought a way out to the end of his life, as did prominent scientists of his day and much later, always in vain. Over time it came to be recognized that Newton had not only effectively destroyed the entire materialist, physicalist conception of the universe, but had also undermined the standards of intelligibility on which the early scientific revolution was based. The outcome is familiar in the history of science. It was described very well in the classic 19th-century history of materialism by Friedrich Albert Lange.
He pointed out that scientists have "accustomed themselves to the abstract notion of forces, or rather to a notion hovering in mystic obscurity between abstraction and concrete comprehension", a turning point in the history of materialism that removed the surviving remnants of the doctrine far from the ideas and concerns of the genuine materialists of the 17th century and deprived them of any significance. That too is now a virtual truism, at least among historians of science. One of the founders of the modern discipline, Alexandre Koyré, wrote 40 years ago that a purely materialistic or mechanistic physics is impossible, and that we simply have to accept that the world is constituted of entities and processes that we cannot intuitively grasp.
The problems of emergence and unification take on an entirely new form in the post-Newtonian era, a form that is furthermore unstable, changing as science comes to accommodate new absurdities, as they would have been regarded by the founding figures of the scientific revolution, including Newton. And I know of no reason to suppose that this process has come to an end. It is worth pointing out that the only part of our knowledge, or what we take to be knowledge, for which we can claim much confidence is our mental world, that is, the world of our experience. As reflective beings, humans try in various ways to make sense of this experience. One part of this effort is sometimes called folk science; when it is conducted in a more systematic, careful, controlled way, we nowadays call it science.

One standard conclusion of contemporary science is that each organism, humans in particular, reflexively develops what ethologists call an Umwelt, a particular mode of constructing and interpreting experience given the data of sense. This is quite different for us and for bees, for example. Furthermore, there is no great chain of being.
Noam Chomsky – The biolinguistic turn lecture notes – part five
In fundamental respects, insects have richer experience, and more sophisticated ways of dealing with it for action, than humans do. Among other standard conclusions of modern science are those that Priestley and many others drew centuries ago about thinking matter, reiterated at the end of the Decade of the Brain, just two years ago, without notable change, or, maybe surprisingly, without much awareness that this is revival, not innovation. And it is the revival of something that was taken to be an unavoidable truism two centuries ago, for quite good reasons, given the lack of a positive determinate account of the non-mental part of the world, what is sometimes called the physical world.
The hard part of the mind-body problem has in recent years conventionally been taken to be consciousness. Talk of that kind is misleading at best, if it is even meaningful; it may not be. Sometimes the problem is not clearly posed; it is posed in terms of questions to which we cannot even think of wrong answers. So, for example, there is no sensible answer to the question "what is it like to be me?", or "what is it like to be a bat?", as in Thomas Nagel's famous paper. There are bad answers to that; there are no good answers. Formal semantic inquiries often take the meaning of a question to be the set of propositions that are answers to it, and if that is at least a condition on meaning, then it follows that if there are no sensible answers, the question has no meaning. Even when legitimate questions are posed, we don't have any good reason, as far as I can see, to suppose that they are intrinsically harder than lots of other problems, say the problems posed for our understanding by quantum mechanics, or by cosmological theories of an infinity of universes, or, for that matter, by the properties of motion.
We don't have any reason that I know of to question the opinion of Newton, David Hume and other not inconsiderable figures who in various ways reached Locke's conclusion that motion has effects "which we can in no way conceive motion able to produce". Even before Newton, puzzlement about motion was profound. His precursor Sir William Petty described springing or elastic motion as "the hard rock in philosophy"; philosophy here means what we call science. The obscurity was so great, Robert Boyle felt, as to prove the existence of an intelligent author or disposer of things. Even the skeptical Newtonian Voltaire felt that the "impenetrable mysteries of motion" proved that there must be a God who gave movement to matter, rather than Locke's suggestion.
One cannot say that the hard problem was solved. It was just abandoned in the course of a significant revision of the enterprise of science: the recognition that in some fundamental sense the world is just unintelligible to us, and that we have to reduce our sights to the search for intelligible theories. That is something quite different. And even that goal has been strongly contested by prominent physicists, for example in the critiques a century ago of atomic theory, or even of the idea that physics should go beyond establishing quantitative relations between observable phenomena. The significance of this shift should not be underestimated. It was recognized soon enough, e.g. by David Hume, who wrote that Newton's discoveries revealed "the obscurity in which nature's ultimate secrets ever will remain".
These mysteries of nature, as Hume called them, referring to the phenomenon of motion, will remain beyond our cognitive reach, perhaps, we might speculate (he didn't), for reasons that are rooted in the biological endowment of the curious creature that alone is able even to contemplate these questions.
Well, I did talk about these topics 35 years ago, and what has happened since, including, incidentally, my own delayed self-education, inclines me to believe that what I said then should be reiterated, much more forcefully and in much greater depth, and with much more explicit connections drawn to contemporary discussions of problems of language and mind.
Well, let us return to the narrower question of the emergence of mental aspects of the world, or perhaps the development of an account of the non-mental world that can be unified with them, if the physics-chemistry model turns out to be accurate. The scale of the gap that remains, and the very dubious grounds for the general optimism about overcoming it, are revealed very clearly in the American Academy symposium that reviewed the state of understanding at the end of the millennium. One leading specialist on vision, who was toward the optimistic end of the spectrum, nevertheless reminded the reader that how the brain combines the responses of specialized cells to indicate a continuous vertical line is a mystery that neurology has not yet solved, or even, for that matter, how one line is differentiated from others or from the visual surround.
Noam Chomsky – The biolinguistic turn lecture notes – part six
The journal of the American Association for the Advancement of Science devoted an issue a year ago to neuroscience. The summary article, coauthored by the Nobel laureate Eric Kandel, was subtitled "Breaking down scientific barriers to the study of brain and mind". The article covers very interesting ground, but ends up with the conclusion that the neuroscience of higher cognitive processes is only beginning. It is surely beginning from a higher plane than was constructed by Descartes, who was in many ways the founder of modern neuroscience, but nonetheless it is still the beginning. Fundamental questions remain beyond even dreams of resolution, including those that were traditionally considered to be at the heart of the theory of mind, such as, for example, choosing some action, or even thinking about doing so. There has been very valuable work on narrower questions, e.g. how an organism executes a plan for integrated motor action: how a cockroach walks, or how a person reaches for a cup on the table.
But no one even raises the question of why the person, or the cockroach, executes one plan rather than some other one. That question is not raised even for the very simplest organisms, single-cell organisms. In fact, the same is true even for visual perception, which is often considered a passive process. A couple of years ago a few cognitive neuroscientists, one a colleague of mine, published a review of research on a problem that was posed in 1850 by Helmholtz: even without moving our eyes, we can focus our attention on different objects at will, resulting in very different perceptual experiences of the same visual field.
There has been interesting work on that, but the phrase "at will" points to an area that is beyond serious empirical inquiry. It remains as much of a mystery as it was for Newton at the end of his life, when he was still seeking what he called a "subtle spirit" that lies hidden in all bodies and that might, without absurdity, account for their properties of attraction and repulsion, the nature and effects of light, sensation, and the way the members of animal bodies move at the command of the will. These were all comparable mysteries for Newton, perhaps even beyond our understanding, he thought, like the principles of motion, and like the classical problems of the theory of mind at least since Descartes, who incidentally also regarded them as possibly beyond human understanding.
Even if we restrict ourselves to the study of mechanisms, the gaps are quite substantial. One of the leading neuroscientists, Randy Gallistel, pointed out recently that "we clearly do not understand how the nervous system computes", or even the foundations of its ability to compute, even for the small set of arithmetic and logical operations that are fundamental to any computation. He happens to be talking about insects, but the point obviously extends beyond them. In another domain there is one of the founders of contemporary cognitive neuroscience, Hans-Lukas Teuber.
He introduced an important review of perception and neurophysiology by writing that "it may seem strange to begin with the claim that there is no adequate definition of perception and to end with the admission that we lack a neurophysiological theory", although that was the most that could be said. It is true that that was 40 years ago, and there were dramatic discoveries right at the time that he was writing, and since. But I suspect that Teuber would have expressed much the same judgment today. Teuber also outlined the standard way to move toward addressing the problem of unification. He explained that his purpose in reviewing the perceptual phenomena, and offering a speculative psychological account of them, was that this might suggest directions in which the search for the neural basis of perception should proceed, namely by clarifying the assumptions that those neural bases must satisfy. That is a classic approach, along with the restriction of the scientific enterprise to more modest goals, namely intelligibility of theory rather than of the world.
Another consequence of the demolition of the hopes of the Galilean revolution for a mechanical conception of the world was the recognition that scientific inquiry is going to have to be local in its expectations. Overarching unification may take place, but perhaps over a long term, and in ways that cannot be anticipated. The 18th-century English chemist Joseph Black set the tone for subsequent scientific work by recommending that chemical affinity "be received as a first principle, which we cannot explain any more than Newton could explain gravitation, but let us defer accounting for the laws of affinity until we have established such a body of doctrine as Newton has established concerning the laws of gravitation". And chemistry in fact proceeded along this course, separating itself increasingly from physics. Physics followed Newton's admonition that nature will be "conformable to herself and very simple", observing a few general principles of attraction and repulsion that relate the elementary particles of which all matter is constituted, more or less the way different buildings can be constructed from the same bricks.
Noam Chomsky – The biolinguistic turn lecture notes – part seven
The goal was therefore to understand, to quantify, to reduce the whole of nature to simple laws, as Newton had done for, say, astronomy. Arnold Thackray, in his history of Newtonian matter theory and the development of chemistry, wrote that "this was the compelling, this was the enticing, indeed the almost bewitching goal" of much work, pursuing the thoroughly "Newtonian and reductionist task of uncovering the general mathematical laws which govern all chemical behavior."
There was also a distinct chemical tradition, following the path that was outlined by Joseph Black, who more or less founded modern chemistry and who tried to remain neutral, probably to avoid controversy; but his own work helped to found a separate chemical track. John Dalton abandoned entirely the Newtonian corpuscular theory of matter. He adopted the radically different view that matter could exist in heterogeneous forms with varied principles. His approach, Thackray writes, "was chemically successful and therefore enjoyed the homage of history, unlike the philosophically more coherent, if less successful, reductionist schemes of the Newtonians".
By the end of the 19th century, the fields of interest of chemists and physicists had become quite distinct. Quoting a standard history of chemistry: chemistry dealt with a world consisting of some 90-odd material elements, with many and varied principles and properties, while physicists handled a more nebulous mathematical world of energy and electromagnetic waves, perceived in light, radiant heat, electricity, magnetism, later radio waves and X-rays. The chemists' matter was discrete and discontinuous, the physicists' energy was continuous, and the gap appeared unbridgeable. Meanwhile chemists developed a rich body of doctrine, achieving chemistry's triumphs in isolation from the newly emerging science of physics. As I mentioned, the isolation ended only recently, in a completely unanticipated way: not by reduction, but by unifying a radically revised physics with the bodies of doctrine that chemistry had accumulated, which had in fact provided important guidelines for the reconstruction of physics. That is basically Teuber's point about perception.
And that has happened often in the history of science, and we cannot know whether something similar might be required for the unification of the study of brain and mind, assuming this to be a task within our cognitive reach. And that we don't know either.
Well, I have already suggested, and will repeat, that there are interesting and important parallels between the debates concerning the reality of chemistry up to unification, which was just 56 years ago, and current debates in the philosophy of mind about the reality of the constructions of psychological approaches. The former debate (chemistry and physics) is now understood to have been totally pointless, based on serious misunderstanding.
We simply have no grasp of reality other than what our best explanatory theories provide. If they happen to be computational theories, okay, that's reality. My own view, which I have discussed elsewhere, is that the current debates, very much alive right now, are also largely pointless, and for essentially the same reasons. This includes central topics of the philosophy of mind and theoretical cognitive science, which those of you in the discipline will recognize.
Considerations of the kind that I have been reviewing were in the background of the so-called cognitive revolution of the 1950s, at least for some of the participants, although it was unknown at the time. In many ways the shift of perspective brought about by the cognitive revolution actually recapitulated the first cognitive revolution of the 17th century. That includes the focus on vision and language, in the latter case adopting the biolinguistic approach, that is, shifting the focus of attention from phenomena like behaviour and its products, say texts, to the inner mechanisms that enter into producing the phenomena. Thus a shift, but actually a shift that had already been taken in the 17th century; in the long interval between there had been a regression.
Noam Chomsky – The biolinguistic turn lecture notes – part eight
Notice again that that shift still leaves us a long way from the problems of action. That is a vastly different matter. I have myself often quoted Wilhelm von Humboldt's aphorism that the core problem of the study of language is the "infinite use of finite means". It was a leading concern of Cartesian philosophy before him, and a problem that really could not be posed until the mid-20th century, when the concept of recursive generative procedures was fully clarified. These procedures constitute the finite means that are put to infinite use. But it is important to be aware, and I don't think I have stressed this enough, that despite quite a lot of progress in understanding the means, the means that are employed for infinite use, the question of how they are used is scarcely even addressed. And it was that question that was the fundamental one for Descartes, Humboldt and other early modern figures. And again, those questions are not even addressed for insects, let alone humans.
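Humboldt's "infinite use of finite means" can be made concrete with a small sketch. The grammar and vocabulary below are hypothetical, chosen only to show how a finite set of recursive rewrite rules generates an unbounded set of expressions:

```python
from itertools import product

# A toy recursive generative procedure (hypothetical grammar, for illustration):
# a finite set of rewrite rules generates an unbounded set of expressions.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursion via relative clause
    "VP": [["V", "NP"], ["V"]],
    "N":  [["cat"], ["dog"]],
    "V":  [["saw"], ["slept"]],
}

def generate(symbol="S", depth=0, max_depth=3):
    """Yield every expansion of `symbol` as a word list, bounding recursion depth."""
    if symbol not in RULES:            # terminal word
        yield [symbol]
        return
    if depth > max_depth:              # cut off here, but deeper bounds yield more
        return
    for rule in RULES[symbol]:
        expansions = [list(generate(s, depth + 1, max_depth)) for s in rule]
        if any(not e for e in expansions):
            continue                   # a sub-symbol could not expand at this depth
        for combo in product(*expansions):
            yield [word for segment in combo for word in segment]

shallow = {" ".join(s) for s in generate(max_depth=3)}
deeper = {" ".join(s) for s in generate(max_depth=5)}
print(len(shallow), len(deeper))  # the deeper bound admits strictly more sentences
```

Raising the depth bound never removes sentences and always admits new ones; that is the sense in which a finite rule set puts its means to infinite use.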
It is reasonably clear that the human capacity for language is what's called a species property, that is, biologically isolated in essential respects and close to uniform across the species. That actually seems less surprising today than it did not long ago, in the light of very recent discoveries: the very limited genetic variation among humans, as compared with other primates, suggests that we have all descended from a very small breeding group, maybe a hundred thousand years ago. Humans are basically identical from the point of view of an outside biologist looking at us. The biolinguistic approach adopted from the start what has been called, I quote the recently published encyclopedia of cognitive neuroscience, "the norm these days in neuroscience, the modular view of learning, that is the conclusion that in all animals learning is based on specialised mechanisms, instincts to learn in specific ways" – Randy Gallistel again. These organs within the brain perform specific kinds of computation in accordance with specific design, apart from extremely hostile environments. The organs change state under the triggering and shaping effect of external factors. They do so more or less reflexively and in accordance with internal design. That is the process of learning, although growth might be a more appropriate term, avoiding the misleading connotations of the term learning. The language organ, the faculty of language, fits that normal pattern. According to the best theories we have, each attainable state of the system (an I-language) is a computational system that determines, generates in a technical sense, infinitely many expressions.
Each of these expressions is a store of information about sound and meaning which is accessed by performance systems. The properties of the I-language result from the interplay of several factors. One factor is individual experience, which selects among the options permitted by the initial state. A second factor is the initial state itself, which is the product of evolution. And a third factor is general properties of organic systems, in this case computational systems, incorporating, it is reasonable to expect, principles of efficient computation.
The general picture, involving crucially the third factor, is familiar in the study of organic systems generally. The classic work of D'Arcy Thompson and Alan Turing on organic form and morphogenesis is an illustration, a topic current in contemporary biology. One current example might be suggestive in the present context. There is recent work by Christopher Cherniak, a mathematical biologist in Maryland, who has been exploring the idea that minimization of wire length – as in microchip design – should produce the best of all possible brains. He has tried to explain in these terms the neuroanatomy of the nematode, one of the simplest and best-studied organisms, and also various pervasive properties of nervous systems, such as the fact that the brain is as far forward as possible on the body axis. He wants to show that this is just a property of efficient computation based on wire-length minimization.
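The wire-minimization idea can be illustrated with a toy layout problem. This is only a sketch in the spirit of Cherniak's argument, with made-up components and connection weights, not his actual model of the nematode:

```python
from itertools import permutations

# Toy wire-length minimization (hypothetical components and weights): place
# components on a line so that the total length of their connections,
# weighted by the number of wires, is minimal.
components = ["brain", "sensor_a", "sensor_b", "muscle"]
# (a, b, wires): the brain is densely wired to both sensors, lightly to the muscle
edges = [("brain", "sensor_a", 3), ("brain", "sensor_b", 3), ("brain", "muscle", 1)]

def total_wire(placement):
    """Sum of weighted wire lengths for components placed at positions 0..n-1."""
    pos = {c: i for i, c in enumerate(placement)}
    return sum(w * abs(pos[a] - pos[b]) for a, b, w in edges)

# Brute-force search over all placements for the cheapest layout.
best = min(permutations(components), key=total_wire)
print(best, total_wire(best))  # the heavily wired component sits near its partners
```

Even in this crude form, the optimum places the densely connected component in the interior, next to what it talks to most: the position of the "brain" falls out of the wiring economics alone.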
Well, one can trace interest in this third factor – general properties of organisms – back to a Galilean intuition, namely his concept that "nature is perfect", from the tides to the flight of birds, and that it is the task of the scientist to uncover in just what sense this is true. Newton's confidence that Nature must be very simple reflects the same intuition. However obscure it may be, that intuition – what Ernst Haeckel called "nature's drive for the beautiful" – has been a driving theme of modern science since its modern origins with the Galilean revolution, perhaps its defining characteristic.
It is hard to say exactly what it is, but that it is a guiding intuition is not in doubt. Biologists, however, have tended to think rather differently about the objects of their inquiry. Very commonly they adopt the image of François Jacob, the Nobel laureate, of nature as what he called a tinkerer, which does the best it can with the materials at hand – often a pretty rotten job, as human intelligence seems to be keen on demonstrating about itself.
One well-known contemporary biologist, Gabriel Dover, a British geneticist, concludes in a recent book that "biology is a strange and messy business and perfection is the last word one can use to describe how organisms work, particularly anything produced by natural selection." Though of course produced only in part by natural selection, as he emphasizes, and as any biologist knows, and to an extent that cannot be quantified by available tools.
Well, we just don't know which of these conflicting intuitions is more accurate – the Galilean intuition or, say, Jacob's – and we will not know until we know the answer, and the answers seem very remote. The same author, Gabriel Dover, writes that "we are nowhere near relieving our deepest ignorance about the biological world around us". He goes on to reserve his sharpest words "for those who seek scientific respectability" for complex behavioral phenomena in humans that we cannot even begin to investigate seriously. He calls that "a sign of intellectual laziness at best and shameless ignorance at worst" when confronting issues of massive complexity which far exceed the reach of contemporary science. He gives some examples, but for charity I ignore them.
Noam Chomsky – The biolinguistic turn lecture notes – part nine
The long-term goal of investigating the third factor – that is, the role of general properties of organisms in determining the faculty of language and the states it can attain (internal languages) – was actually formulated in the early days of the biolinguistic turn, but put aside as unfeasible. Attention focused on the first two factors, experience and the initial state – in technical terminology, the problems of descriptive and explanatory adequacy.
The latter is how the initial state enters into determining the transition to the final state, the state attained. The earliest attempts, 50 years ago, to replace traditional or structuralist accounts of language by generative rule systems revealed very quickly that very little was known about the sound, meaning and structure of language, and that huge problems had been unwittingly swept under the rug, rather as in the days when it was assumed that bodies fall to their natural place. As has often been the case, one of the hardest steps in the development of a science is the first one, namely to be puzzled by what seems so natural and obvious. To gain some realistic sense of what had been overlooked was an enormous task in itself.
Even more so in the light of the recognition that the apparent complexity and diversity of languages that was very soon discovered just had to be an illusion. The reason for that conclusion is a standard one in biology: as in the case of other organs of the body, experience can play only a very limited role in determining the state that is attained, in this case the attained I-language. Even a young child has mastered a rich and highly articulated system of sound and meaning and structural properties that goes far beyond any evidence available, and it is shared with others who have different but also highly restricted experience. So it has to be the case that the initial state plays an overwhelming role in determining the language the child attains, in all of its aspects. Experience surely has a triggering and shaping role, as in the case of other organs, but it has to be a limited one.
So there is no reason to suppose that language and other higher mental faculties depart radically from what is otherwise known in the biological world. The task was to show that the apparent richness and complexity and diversity is in fact an illusion: that all languages are cast in the same mold, and that experience serves only to set options within a fixed system of principles, all determined by the initial state, as is the case with other biological systems.
Well, a great deal of research over the past 40 years in these areas has been driven by a kind of tension between descriptive and explanatory adequacy, that is, the tension between the search for true theories of I-languages, the attained states, on the one hand, and the true theory of the invariant initial state of the language organ on the other. The invariant initial state is the topic of what has come to be called "universal grammar", adapting a traditional notion to quite a new context. The search for descriptive adequacy, like a true theory of Hungarian, leads to complex and intricate accounts of particular constructions in particular languages, different from one another. In contrast, the search for explanatory adequacy seeks to find the common ground from which the existing languages arise, given the data that are structured as experience by the operations of the initial state, again in some unknown manner.
The first proposals, from the 1950s, suggested that the initial state – the topic of universal grammar – provides a kind of format for rule systems and organization, and a procedure for selecting one instantiation of the format over another in terms of its success in capturing authentic linguistic generalizations, an empirical notion that also incorporates a kind of theory-internal version of standard best-theory considerations. The rules themselves, at the beginning, were adaptations of informal traditional notions which had proven to be utterly inadequate when subjected to close examination. So that meant rules for forming relative clauses in Hungarian, or passives in Japanese, or causatives in the Romance languages.
The general approach did offer a kind of solution to the core problem of the study of language, sometimes called in the literature the logical problem of language acquisition: that is, how does the initial state map constructed experience to the final state? But, as was emphasized, that solution holds only in principle, because in practice the conception was unfeasible, given the astronomical computational demands. Well, from about 40 years ago, attempts were made to reduce the scale of the problem by seeking valid general principles that could be abstracted from particular grammars and attributed to universal grammar, meaning to the initial state of the language faculty, leaving a residue that might be more manageable.
Actually, some of those proposals were the kind of proposals that were then being explored and that I reviewed in lectures here 35 years ago. After that time considerable progress took place, but it still left the tension unresolved; that is, the general picture was somehow fundamentally defective. There was no true solution, no feasible solution, to the logical problem of language acquisition. A possible resolution of that tension was reached, after a good deal of effort, about 20 years ago, with the crystallization of a picture of language that marked a very sharp break from a long and rich tradition tracing back to classical India and Greece. It is sometimes called the Principles and Parameters approach, and it dispenses entirely with the core notions of traditional grammar, notions like grammatical construction or grammatical rule. From this point of view, categories such as relative clause or passive construction are understood to be real enough, but only as taxonomic artifacts – like, for example, aquatic organisms, which would include, say, dolphins, trout, eels, and some bacteria. It is a category, but not a biological category.
Noam Chomsky – The biolinguistic turn lecture notes – part eleven
The phenomenal properties of these artifacts result from the interaction of invariant principles of the initial state (the faculty of language) with a finite number of parameters fixed in one or another way. It would incidentally follow that there are only finitely many possible human languages, apart from idiosyncrasy and the choice of lexical items – and even these are sharply constrained. That means the problem of unfeasible search is eliminated, a major conclusion if correct. The conception has now been applied to typologically different languages of just about every known kind.
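A minimal sketch of why this matters for learnability, with invented parameter names: if grammars differ only in finitely many binary parameter settings, the learner's search space is finite rather than astronomical:

```python
from itertools import product

# Toy sketch of the Principles-and-Parameters idea: with n binary parameters
# there are only 2**n possible grammars, so the learner searches a finite
# space. Parameter names here are illustrative only.
PARAMETERS = ["head-initial", "pro-drop", "wh-movement"]

grammars = list(product([False, True], repeat=len(PARAMETERS)))
print(len(grammars))  # 2**3 = 8 possible parameter settings

def linearize(head_initial, verb="saw", obj="the dog"):
    """One parameter's phenomenal effect: verb-object order flips with the setting."""
    return f"{verb} {obj}" if head_initial else f"{obj} {verb}"

print(linearize(True), "|", linearize(False))  # saw the dog | the dog saw
```

Experience, on this picture, does not construct a grammar; it merely selects one of the finitely many settings, the way the single `head_initial` flag above selects between two surface word orders.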
It led to many discoveries, a host of new questions that were never before contemplated, and sometimes suggested answers. This Principles and Parameters approach is an approach, not a theory; within the general approach there are many diverse theories. There is actually a very good introduction to the topic just published by Mark Baker – Atoms of Language. He himself has made major contributions to the approach. He has been working primarily on languages that appear to be at opposite ends of the spectrum of typological possibilities, picking them on purpose of course: Mohawk and English, the pair he has studied most intensively, trying to show that although they are about as different phenomenally as two languages can be, they are in reality virtually identical apart from very small changes in a few parameters. Take, say, a Martian observer who views humans as we do other organisms: it would conclude that the languages are essentially identical, dialectal variants of the same language.
There has been extensive work of a similar character carried out worldwide, with quite revealing results. One major program, funded by the European Union, is studying the vast number of languages in Europe – misleadingly called things like German and Italian and so on, though totally different languages are included under these characterisations. And it is being done elsewhere as well. I don't want to suggest that the approach has been established – that is very far from true – but it has been very successful as a research program, as a stimulus to empirical and theoretical inquiry. Progress towards the goals of descriptive and explanatory adequacy has far surpassed anything that preceded it, not only in depth of analysis of particular languages, but also in the range of typologically different languages that have been investigated, and also in new areas of linguistic structure that had barely been explored before.
Related fields, such as the study of language acquisition, have also been completely revitalized within a similar framework. They now look totally unlike anything that was around 20 or 30 years ago. There are some important steps towards convergence, although it is certainly going to be a long and difficult course even if the approach turns out to be on the right track. We are far from having a clear idea of what the principles and parameters actually are. But I think it is fair to say that the study of language, in the last 20 years, has moved to an entirely new plane.
I want to pick up these topics tomorrow and then move on to the issues, particularly the third factor – general properties of organisms – and then move on to the questions of intentionality, that is, the question of how language, now understood within the biolinguistic framework, relates to the rest of the world.