
Kinds of Minds


THOMAS E. DICKINS & KEITH FRANKISH*

* This is an e-print of a review published in History and Philosophy of Psychology Newsletter, 24, Spring 1997, pp. 36-40. It includes some corrections and minor revisions.


Daniel Dennett's new book aims to introduce a general audience to current work in the evolutionary modelling of minds. This work, of which Dennett is one of the pioneers, is assuming increasing importance in contemporary cognitive science. A second, and equally important, aim of the book is to challenge the Cartesian, anthropomorphic, and realist prejudices of the lay public.


Dennett begins, in his deceptively easy style, with a list of questions prompting the reader to consider which organisms might or might not possess minds. These questions are designed to bring readers up short and force them to think about where to draw the line:


"We left Houston at dawn, headin' down the road - just me and my truck." Strange. If this fellow thinks his truck is such a worthy companion that it deserves shelter under the umbrella of "we," he must be very lonely. Either that, or his truck must have been customized in ways that would be the envy of roboticists everywhere. In contrast, "we ... just me and my dog" doesn't startle us at all, but "we ... just me and my oyster" is hard to take seriously. In other words, we're pretty sure that dogs have minds, and we're dubious that oysters do. (p. 4)


This problem is not only of theoretical interest, but also raises important ethical issues (pp. 4-5). In attributing mentality to a creature, we also grant it certain rights. Unfortunately, there is no agreement within the scientific and philosophical communities as to what the possession conditions for minds are.


Dennett begins by warning the reader away from what he takes to be a blind alley, and sketches once again his well-known, instrumentalist view of the mind. To have a mind, he points out, is to have intentional states - states with representational content. And it is easy to set the possession conditions for such states very high - so high, according to some philosophers, that animals do not really count as having minds at all. This, Dennett thinks, is a mistake. Intentionality, he holds, is a widespread natural phenomenon, and human thought is just a fancier version of the sensitivities and tropisms exhibited by simple systems such as plants, cells, and primitive robots. We can think of such systems as pursuing goals and registering environmental information that is relevant to attaining them. Of course, these systems do not explicitly represent the goals they seek; but the goals are nonetheless real, since they are the ones which evolutionary processes (or, in the case of robots, their human creators) have designed them to pursue. Such systems have reasons for their actions, but their reasons are, as Dennett puts it, "free floating". For example, a fledgling cuckoo ousts the eggs of its unwitting adoptive parents:


Why does it do this? Because those eggs contain rivals for the attentions of its surrogate providers. By disposing of these rivals, it maximizes the food and protective care it will receive. The newborn cuckoo is, of course, oblivious; it has no inkling of this rationale for its ruthless act, but the rationale is there, and has undoubtedly shaped this innate behaviour over the eons. ... I call such a rationale "free floating," because it is nowhere represented in the fledgling, or anywhere else ... The strategic principles involved are not explicitly encoded but just implicit in the larger organization of designed features. (p. 49)


Just as the cuckoo's "dumb", instinctive behaviour is a product of evolutionary design, so, Dennett suggests, are our more sophisticated representational abilities.


Many philosophers draw a distinction between derived and original (or intrinsic) intentionality. Pictorial and written representations have derived intentionality - their content is derived from the intentions of their human users and producers. The content of our thoughts, by contrast, is not derived, but intrinsic to them. Dennett again demurs. All intentionality, he argues, external and internal, is derived:


a merely mental image of your mother - or Michelle Pfeiffer - is about its object in just as derived a way as the sketch you draw. It is internal, not external, but it is still an artefact created by your brain and means what it does because of a particular position in the ongoing economy of your brain's internal activities and their role in governing your body's complex activities in the real, surrounding world. (p. 52)


This immediately raises the question of just what role explicit internal representations (mental images of words and pictures) have in the mental economy. In order to shed some light on this question, Dennett adopts an evolutionary perspective.


He begins by introducing a metaphor: the evolution of the mind, he suggests, involved a progressive ascent of the Tower of Generate and Test. The Tower consists of four floors, each of which represents a more efficient way of solving day-to-day survival problems. Each progressive solution is a "better move" than the one before. Thus, the ground floor is inhabited by Darwinian creatures that are blindly generated by natural selection and possess different hardwired phenotypes. Their responses to survival problems are determined by their genetic inheritance and are quite inflexible.


The second floor is inhabited by Skinnerian creatures. These can vary their phenotypic response to the environmental contingencies they encounter. Skinnerian creatures also possess hardwired reinforcement mechanisms that bias them to make what Dennett terms "Smart Moves". A Skinnerian creature will vary its response to stimuli until something good comes of it, whereupon it will become conditioned to produce the same response again should similar stimuli be encountered. Such conditioning is possible, of course, only if the initial response is not fatal.


Popperian creatures, who inhabit the third floor, run less risk of making fatal first moves. These creatures have an inner environment - a mental representation of the external world - and can run internal simulations of various courses of action. In this way, they can calculate the likely effects of candidate actions and eliminate the ones likely to have undesirable consequences - thus "permitting their hypotheses to die in their stead", as Karl Popper puts it.
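For readers who think computationally, the contrast between the second and third floors can be put as a toy generate-and-test sketch. The Python below is not from Dennett or from the review; the situations, payoff numbers, and function names are invented for illustration. The point it captures is simply this: the Skinnerian agent must try an action in the real world before reinforcement can shape its behaviour, whereas the Popperian agent previews candidate actions against an inner model, so its hypotheses, rather than the creature itself, bear the risk of a bad first move.

```python
import random

# Illustrative sketch only: a "Skinnerian" agent learns by acting and being
# reinforced, while a "Popperian" agent tests candidate actions against an
# inner model before acting. All names and payoff values are hypothetical.

ACTIONS = ["freeze", "flee", "approach"]

# Hypothetical payoffs the environment actually delivers.
PAYOFF = {
    "predator": {"freeze": 0, "flee": 2, "approach": -10},  # approaching is disastrous
    "food":     {"freeze": 0, "flee": -1, "approach": 5},
}

def skinnerian_choice(history):
    """Trial and error: repeat whatever was reinforced before, otherwise guess.
    The first guess is made 'live', so a bad first move can be fatal."""
    reinforced = [action for action, reward in history if reward > 0]
    return random.choice(reinforced) if reinforced else random.choice(ACTIONS)

def popperian_choice(inner_model, situation):
    """Generate candidate actions and test them against an *inner* model,
    letting bad hypotheses die in the creature's stead."""
    return max(ACTIONS, key=lambda action: inner_model[situation][action])

if __name__ == "__main__":
    situation = "predator"
    # The Popperian agent's inner model is here assumed to mirror the world exactly.
    print("Popperian picks:", popperian_choice(PAYOFF, situation))  # safe first move
    print("Skinnerian picks:", skinnerian_choice(history=[]))       # may blunder into "approach"
```

In this toy setup the Popperian agent's advantage depends entirely on the quality of its inner model; with a poor model it is no better off than its Skinnerian cousin, which is one way of reading Dennett's point that Popperian abilities are still limited by genetic endowment.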


Popperian creatures are much smarter than their Skinnerian cousins. However, their ability to form and test hypotheses is still limited by their genetic endowment. Their representational abilities, in particular, may remain relatively encapsulated, so that information from one domain is not routinely made available for the solution of problems occurring in others. Gregorian creatures, who live on the next floor, are smarter yet. They supplement their innate problem-solving abilities with mind tools acquired from their peers. They have learned Richard Gregory's lesson that tools not only display intelligence, but create it too. A well-designed tool meshes with our native abilities and extends them in new and far-reaching ways. (Think, for example, of how a pair of scissors extends our ability to manipulate and shape artefacts.) The mind tools which Gregorian creatures possess are culturally transmitted tricks, shortcuts, and strategies which enable them to arrive more swiftly at Smart Moves for solving problems. The most powerful of these mind tools, Dennett suggests, are words.


This picks up themes from Andy Clark's recent work. Clark argues that language augments existing computational abilities in several ways. For instance, language enables us to make plans and coordinate our actions. If you tell a friend you will meet at 1 p.m. on Monday for lunch, then your friend can time her other activities accordingly. Language also permits the mental rehearsal of self-directed commands and exhortations in order to focus behaviour, aid recall, and reinforce learning. Clark writes:


The role of public language and text in human cognition is not limited to the preservation and communication of ideas. Instead, these external resources make available concepts, strategies and learning trajectories which are simply not available to individual, un-augmented brains. Much of the true power of language lies in its underappreciated capacity to reshape computational spaces which confront intelligent agents. (Clark 1995, p. 18)


We are Gregorian creatures, and the manipulation of words and other mind tools (memes, as Dawkins calls them) is the distinctive mark of human mentality. But it is not a mark of mentality as such; mind tools are just another of Mother Nature's strategies for keeping us ahead in the complex game of survival, and they derive their content from their role in these strategies.


All of this is very attractive, and is well motivated by a desire to rid cognitive science of Cartesian and essentialist thinking. Dennett provides us with a good structure to pin our speculations upon, a way of avoiding the gritty problem of intrinsic intentionality, and some account of how the human mind differs from that of the not-so-flexible Popperian creatures. It thus seems to provide a potentially fruitful frame for future empirical work.


But the layperson, picking up this Science Masters publication, may be left with a nagging doubt. "Granted", they may say, "the reasons that motivate frogs and bats and cats are free-floating, not intrinsic; but surely ours aren't? We don't just act for reasons, we act on them." This is a legitimate worry, and it suggests that Dennett's model of derived intentionality is too loose as it stands. It is all very well to argue that our representational mechanisms have been selected for because they confer adaptive advantages, but the lay reader may wonder why these mechanisms do not give us a kind of intrinsic intentionality, qualitatively different from the free-floating rationales of more primitive creatures. And even if intrinsic intentionality is illusory, as Dennett claims, it would be nice to have some explanation of why we dupe ourselves into believing in it.


The more expert reader, looking to see how the land lies within cognitive science, may end up asking the same question. Dennett's speculations on the role of natural language in the Gregorian mind may raise hopes of a more substantial solution to the problems of human intentionality. Might not our natural language abilities be the origin of higher-order thinking and the source of real, cognitively potent mental representations? Such hopes are dashed if language functions merely as a tool, facilitating and co-ordinating low-level tasks.


The later chapters of the book are less focused, but rewarding nonetheless. Here Dennett continues his exploration (begun in his Consciousness Explained) of the idea that the high-level structure of the human mind is determined by cultural programming rather than biology. The human mind, he suggests, is a highly organized community of low-level mechanisms in complex dynamical co-dependence with its environment. (He notes, for example, how some old people need their familiar home environment in order to function cognitively.) The final chapter is a vigorous and salutary polemic against woolly-minded anthropomorphism. The further reading section at the end is a wildly eclectic mix of stuff, some of which would baffle the general reader.


Those who have read Dennett's weightier tomes will learn little from this book, but anyone unfamiliar with his work will find it stimulating, provoking, and illuminating. As usual, Dennett's avuncular tone makes for an easy read.