Is the Brain Just a ‘Wet Computer’? – Part one

Edi Bilimoria – UK


It should be clear to anyone who can think that, whereas the brain does display some of the mechanical functions and characteristics of a digital computer, to declare, as the majority of mainstream neuroscientists do, that the brain is nothing but a ‘wet’ computeri is patently ludicrous – as ludicrous as saying that because a concert pianist displays some characteristics of an office typist (the use of fingers on a keyboard) a pianist and a typist are one and the same, or that a piano and a typewriter are the same instrument because they both have keyboards. For a start, it is minds and brains that created and produced computers, not the other way round. The product stands hierarchically on a lower plane than its producer; brains must therefore stand at a higher level of sophistication and subtlety than computers. The fallacy of equating the brain with a mere computer has been pointed out in no uncertain terms by some of the world’s greatest philosophers and psychologists, as well as scientists such as David Gelernter, professor of computer science at Yale University. In his article, appropriately titled The Closing of the Scientific Mind,ii he demolishes what he appropriately calls the ‘master analogy’, unquestioningly accepted by the vast majority of mainstream scientists: that minds are to brains as software is to computers; to put it another way, that the mind is the software of the brain.

This is the foremost view of the mind amongst mainstream scientists, and since the idea is now so ingrained it would be instructive first to review in some depth the arguments in support of this master analogy, also known as computationalism or cognitivism, before exposing the fatal weaknesses in it. What follows is an elucidation and considerable amplification of the salient points in Gelernter’s article.

The Master Analogy: Brain is Just a Computer

What is a computer? It may be rather surprising to ask what appears to be an obvious question. But the word ‘computer’ is now so commonly used that the meaning of the term has become lost in the mass of popular association attaching to it. Here is a list of the most authentic definitions of a computer:

  • A usually electronic device for storing and processing data (usually in binary formiii), according to instructions given to it in a variable program – Concise Oxford English Dictionary, ninth edition.

  • An electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program – US English Dictionary

  • An electronic computer in which the input is discrete rather than continuous, consisting of combinations of numbers, letters, and other characters written in an appropriate programming language and represented internally in binary notation – British Dictionary definition of a digital computer

  • A machine that stores programs and information in electronic form and can be used for a variety of processes… – MacMillan Dictionary

All of these definitions have one thing in common: phrases like ‘according to instructions given to it’, ‘written in an appropriate programming language’ and ‘that stores programs and information’ make it conspicuously obvious that a human programmer is involved. The computer cannot program itself on its own. Computers and machines do only and precisely what humans program them to do.
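The dictionary definitions above can be made concrete with a minimal, purely illustrative sketch (the data and the two-step ‘program’ are invented for the example): data stored in binary form is processed strictly according to instructions a human has supplied, and nothing else happens.

```python
# A minimal sketch of the dictionary definitions (illustrative example only):
# data in binary form, processed according to a stored, human-written program.

# 'Data' in binary form: the character 'A' is just the bit pattern 01000001.
data = ord("A")
print(format(data, "08b"))  # prints "01000001"

# A 'variable program': a list of instructions supplied by a human.
program = [
    lambda x: x + 1,   # instruction 1: increment
    lambda x: x * 2,   # instruction 2: double
]

# The machine executes the instructions exactly as given - nothing more.
result = data
for instruction in program:
    result = instruction(result)
print(result)  # (65 + 1) * 2 = 132
```

Change the program and the behaviour changes; leave it alone and the machine can do nothing it was not told to do, which is the whole point of the paragraph above.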

Notwithstanding this, one of the arch-champions of the idea that the brain is no different in principle – in fact, is a computer – is Daniel Dennett. In his highly influential book Consciousness Explainediv (a better title would be Consciousness Explained Away) he asks us to ‘think of the brain as a computer’. For Dennett, ‘human consciousness can best be understood as the operation of a “von Neumannesque” virtual machine’; which means that human consciousness is a software program or application designed to run on any ordinary computer such as the brain – hence the reference to John von Neumann, credited with inventing the digital computer. It is a thoroughly insulting reference at that, for the great Hungarian mathematician never maintained that minds are to brains as software is to computers – see the footnote belowv on von Neumann’s feeling for divinity. In a limited sense the analogy is fitting, because software comprises coded instructions given to hardware. We can dismantle and dissect the hardware with a scalpel and view it under a microscope if we feel like it, but we cannot dissect software to find the mathematical code, the program, or the programmer. The structures of software and hardware are wholly different, albeit the former is embodied by the latter (without hardware, software would have no significance, and vice versa – the one depends upon the other).

So far so good; but this idea of the embodiment of an entirely different structure is extrapolated to the notion of mind embodied by brain. It is argued (reasonably upon first appearance) that the brain has its own structure and so does the mind, which exhibits reason, memory, imagination and emotions, and happens to be ‘conscious’ (whatever that term may mean to materialists). The structure of the mind cannot be dissected with scalpels, seen through a microscope or revealed by MRI scans, but the structure of the brain can indeed be so dissected and seen, because the brain is a dense mass of physical matter such as neurons and other cells. Yet the mind cannot exist apart from the brain which wholly embodies it. Therefore minds are to brains as software is to computers; and minds cannot exist without brains, just as software cannot exist without hardware. Put another way, without the associated hardware in each case, minds and software are mere abstractions.

Some computationalists take this notion to extremes. For example, Ray Kurzweil (who works, unsurprisingly, for Google) predicts that by 2029 computers will outsmart us and that the behaviour of computers will be indistinguishable from that of humans. (In fact computers do outsmart us even today – in speed of number crunching if nothing else – but that is because human beings have designed and programmed them to do so.) And after the year 2045, machine intelligence will dominate human intelligence to such an extent that men will no longer be capable of understanding machines.vi By then men will have begun a process of machinization, or what he terms ‘transhumanism’: the merging of men and machines by cramming their bodies and saturating their brains with semiconductor chip implants, along with the fine-tuning of their genetic material. Similarly, Drew McDermott at Yale University believes that biological computers (meaning the human brain) differ only superficially from modern digital computers. He goes on to assert that, according to science, humans are just a strange animal that arrived pretty late on the evolutionary scene, that ‘computers can have minds’, and that his avowed purpose is ‘to increase the plausibility of the hypothesis that we are machines and to elaborate some of its consequences’.vii (A strict syllogism would also mean that animal brains are likewise computers.)

Now all this could be dismissed as the fantasies of nerds. But we cannot do so, because their ideas have gained such prominence and are highly pernicious. We need to recover our humanness at all costs, and the warviii against man-equals-computer has to be fought in earnest. Therefore we need to unearth the fallacies in their predictions. But first, out of fairness to the computationalists, we need to understand exactly what they are contending.

The Master Analogy: What It Purports to Show

In summary, then, the whole essence of the problem is the failure to perceive the distinction between the machine and the mechanic, between the instrument and the performer – or whatever similar metaphor we may ascribe to the confusion caused by the blurring of categories. No one could have expressed the barrenness of materialistic theories about mind and consciousness better than the great neurologist Sir Francis Walshe.ix Exposing the naiveté of identifying the mechanism with its informing principle, he writes:

From sheer psychological and philosophical necessity… [there is the] existence in man of an essential immaterial element… psyche, entelechy, anima or soul… setting him above the merely animal.

It has also to be recognized that for the soul’s functioning as an essential element in the hylomorphic human person, it needs some data, of which the brain is the collecting, integrating and distributing mechanism. Yet it would be quite childish to identify the instrument with its user, even though the user be dependent upon the instrument for operating.

Then in the same article he identifies the root cause of the problem, stating that:

We shall have to accept the ancient concept of the soul again: as an immaterial, incorporeal part of the human person, and yet an integral part of his nature, not just some concomitant aspect of man, but something without which he is not a human person… There is a sense in which the present is an age whose characteristic is its failure to understand the status of its own abstractions, and this, perhaps, is the inevitable fruit of the divorce of natural science from metaphysics [and from religion and mysticism, we might add], to have achieved which was the empty triumph of the nineteenth century.

He then comments on the immaturity of reducing the human being to a machine and the mind to a virtual machine:

For me, the chill physico-mathematical concept of the human mind is a muddy vesture of decay in which I am not willing to be enfolded. It is unworthy of the dignity of Man.

Not for Francis Walshe, then, is the mind a von Neumannesque machine. Finally, he demolishes the charge that such an attitude is unscientific by stating:

And if any say that this is not a scientific attitude I am unmoved by the irrelevance, for, outside its proper field of discourse, the word “science” does not intimidate me. Man was not made for science, but science by man, who remains more and greater than his creations.x

In fact the true test of a scientist is his ability, and humility, to see the limits of his theories and to work within them, not to push them into areas in which they are patently incapable of operation or explanation. How many scientists these days are able to say ‘I don’t know, but I’ll try and find out’? Unfortunately the majority of them seem incapable of recognizing (let alone acknowledging) the limits of science and their own limitations, and are extremely fearful of not being able to explain away literally everything, from divinity to cosmos, in materialistic terms.

So, regarding the human being, their strategy is to eliminate, or reduce to the merely physically observable and measurable, all subjectivity – and that means consciousness and feelings. Precisely because feelings and subjectivity are incompatible with the machine paradigm of man-equals-computer, once subjectivity is eliminated the case for the computer-mind is strengthened. And once the mind is reduced to a computer, all sense of personal responsibility, our pangs of conscience, our feeling for divinity and higher aspiration – everything to do with being truly human – are eradicated or explained away at one stroke.

Just ignore everything that distinguishes man from a computer – and man is just a computer and nothing more.

This strategy is elaborated below by explaining how the three-pronged attack against subjectivity is mounted by way of the interrelated arguments of roboticism (or zombism), functionalism, and brain states.

Roboticism

The term is an apt one, as Richard Dawkins has described human beings as ‘lumbering robots’ – which hopefully includes himself in the epithet. In essence the argument goes like this. With the current advance of technology, it is not difficult to imagine a time when a robot could be constructed with all the software needed to display all the characteristics of someone you know, say your best friend. You may ask him about his feelings, or whether he has consciousness, or indeed whether he is human, and he would answer ‘yes’ to all of these. But the answer is merely the clever software programming that has been built into him. There is no way of telling that he is actually feeling anything or experiencing anything at all. He is, in short, indistinguishable from the human you take yourself to be.xi But why assume that you are human? Why aren’t you just like your lumbering robotic friend? After all, what is the point of consciousness? Has not Darwinian theory fully explained that nature selects the fittest creatures on entirely practical grounds like survivability? If your robotic friend and you behave in the same way, what survival purpose does consciousness serve?
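The ‘robotic friend’ of the argument above can be caricatured in a few lines of hypothetical code (the function and its keyword list are invented for illustration): the affirmative answers are scripted in advance, and nothing in the machine experiences anything when it gives them.

```python
# A hypothetical 'robotic friend' (illustrative sketch only): it answers 'Yes'
# to every question about its inner life, yet nothing here feels anything.
def robotic_friend(question: str) -> str:
    # The affirmative answers are scripted by the programmer, not experienced.
    if any(word in question.lower() for word in ("feel", "conscious", "human")):
        return "Yes"
    return "I cannot answer that"

print(robotic_friend("Do you feel emotions?"))  # prints "Yes"
print(robotic_friend("Are you conscious?"))     # prints "Yes"
```

Behaviourally the answers are indistinguishable from a person’s sincere replies – which is precisely the leverage the computationalist argument relies upon.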

Functionalism

The important point to note here is that subjectivity has been reduced to the physical, and therefore to the observable and measurable. As an example, take a piece of music that can make one tearful, such as Schubert’s song Ave Maria. What does being tearful mean according to functionalism? It means that a certain set of physical events (a CD playing, sound waves in the air from the speakers, the action of the tear glands in the eyes – but not actually crying) causes the state of mind known as ‘tearful’. This state of mind (along with others) makes you want to do certain things, like shedding tears. So ‘I want to cry’ means that the mental state (tearful) has not been eliminated but reduced to certain physical circumstances – to what one has been doing and what actions one plans to do (like putting on a CD of the Ave Maria in anticipation of being tearful).

It is no good arguing that one can be reduced to tears without any physical events such as a live performance or a CD recording – simply by imagining the song. After all, MRI scans can show the change in brain states when we experience feelings even if there is no external stimulus. It is simply a matter of biochemical activity, like neurons and neurotransmitters, that makes us tearful. To reiterate, the thorny problem of the relationship and crossing over between physical brain states and subjective experience is neatly solved (meaning explained away) not by eliminating the latter, but by reducing it to the physically observable and measurable. Once subjective states are so eliminated or dispensed with, then what is the difference between a man’s brain and a computer? Can computers have feelings? You see colour (say, red) on your computer screen, but how foolish to think that your computer actually experiences red! Colour on your computer LCD is produced by careful control and variation of the voltage applied across the subpixels. It takes an enormous number of semiconductor transistors to produce the whole palette of colours on your computer. Colour in your head involves an enormous number of neurons firing under precise electro-chemical conditions. So what is the difference in principle? None at all, as the translation of physical brain states into mental states and subjective experience is neatly solved – by eliminating consciousness and feeling, or regarding them as superfluous or illusory (‘epiphenomenal’ being the technical term, which conveys a mask of scientific respectability but in point of fact camouflages the underlying sterility of the ideology).
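The point about screen colour can be made vivid with a small illustrative sketch (the function name and the 8-bit encoding convention are assumptions for the example, not a real display driver): all the machine ever handles when it ‘shows red’ is three numbers controlling subpixel intensities.

```python
# Illustrative sketch (not a real display driver): to the machine, 'red' is
# nothing but three subpixel intensities - numbers that set voltages.
def subpixel_values(red: int, green: int, blue: int) -> str:
    # Each subpixel intensity is conventionally an 8-bit number, 0-255.
    for value in (red, green, blue):
        assert 0 <= value <= 255
    # The 'colour' the computer manipulates is just this hexadecimal triple.
    return f"#{red:02x}{green:02x}{blue:02x}"

print(subpixel_values(255, 0, 0))  # prints "#ff0000" - pure 'red'
```

Nothing in this arithmetic experiences redness; it only shuffles numbers, and that is exactly the gap the functionalist argument proposes to close by redefining experience away.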

Eliminate feelings and subjective states and man is no different from a computer. Can a computer feel anything?

But the wise Schrödinger saw deeper. This extract is highly apposite:

We do not belong to this material world that science constructs for us. Science cannot tell us a word about why music delights us, or why and how an old song can move us to tears. Science can, in principle, describe in full detail all that happens in our sensorium and motorium from the moment the waves of compression and dilation reach our ear to the moment when certain glands secrete a salty fluid that emerges from our eyes. But of the feelings of delight and sorrow that accompany the process science is completely ignorant – and therefore reticent.

Similarly insightful thoughts appear in a recent book, Mind and Cosmos, by Thomas Nagel.xii Its subtitle, Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False, was guaranteed to produce howls of protest, which is precisely what happened. It is irrelevant that Nagel is a distinguished professor of philosophy at New York University, or that he presents his case meticulously, without any reference to religion or, for that matter, any attack on religion, or that he then tentatively suggests some of his own ideas on consciousness in the tradition of good science. His case is quite simply, as his subtitle states, that Darwinian evolution is insufficient to explain the emergence of consciousness, and that the mainstream dictum on the workings of the mind, rooted in the Darwinian paradigm, is inadequate. But even to question Darwin politely is seen by the cognoscenti of science as an attack upon the omniscient god of evolutionary theory. To risk doing so provokes the same hysterical backlash as attacking the god of religious fundamentalists – the very god that these materialists heartily decry. Here we have to draw a clear distinction between Darwin’s ideas, presented by way of propositions, and the Neo-Darwinians who have elevated those ideas to ‘biblical’ status. To question molecular mechanisms or Darwin, then, is tantamount to questioning god, and the excommunication by the scientific fraternity (the church of scientism) is severe. Sober scientists are not physically burnt at the stake, but the personal abuse and the damage to their careers are no different in principle.
A fine example of the sacredness of molecular science is the editorial that appeared on the front page of Nature under the title ‘A book for burning?’ The book in question was of course A New Science of Life by Rupert Sheldrake, which the editor, Sir John Maddox, denounced in a savage attack, saying that ‘even bad books should not be burned; works such as [Hitler’s] Mein Kampf have become historical documents… His [i.e. Sheldrake’s] bookxiii is the best candidate for burning there has been for many years’.xiv Moreover, in a BBC interview, Maddox said: ‘Sheldrake is putting forward magic instead of science,xv and that can be condemned in exactly the language that the Pope used to condemn Galileo, and for the same reason. It is heresy.’xvi It is worth pausing at this juncture and fully taking in the import of all this. A leading man of science, on whom a knighthood was conferred, a Fellow of the Royal Society and editor of one of the leading scientific journals, is recommending in all seriousness that a book by the Nazi leader should be retained owing to its historical value ‘for those concerned with the pathology of politics’, as he puts it, but that Sheldrake’s book, written by an eminent man of science, should be eradicated. However much Maddox chose to revile its content, why should not Sheldrake’s book also be of value for those concerned with the pathology of science?

Enough has been said to demonstrate the power of prejudice and bigotry, and to show that those deemed heretics, whether in religion or in science, are dealt with by the established church (of religion or of scientism) in essentially the same way – burning and ostracization. But there are glorious exceptions, such as Brian Josephson, a Nobel laureate, who wrote a riposte in Nature stating that the editor ‘show[ed] a concern not for scientific validity but for respectability’, and that ‘the fundamental weakness is a failure to admit even the possibility that genuine physical facts may exist which lie outside the scope of current scientific descriptions.’xvii Nothing, then, better exposes the weakness of the fundamentalists of the Neo-Darwinian camp than when they resort to mass attack and expletivesxviii against someone who proposes sensible counter-arguments – often painful truths that shatter their ideological stronghold and the publicity and research grants that go with it. Similar calumny and insult were heaped upon Blavatsky, who also showed in painstaking detail that Darwinian evolution was not wrong as such, but wholly incomplete as an account of evolution without invoking the grand occult doctrines that she transmitted through her magnificent works.

Brain States

This argument is really a variant of functionalism. It posits that changes in our emotions and feelings are caused solely by changes in brain states. Our minds are simply the product of our genes and brain, so to destroy or damage a part of the brain is to affect the personality trait corresponding to the damaged region. This contention is based on considerable data and research showing how persons who have suffered brain damage through injury or stroke display marked changes in emotional behaviour and personality after their injury. Indeed there are numerous learned books and peer-reviewed scholarly articles in international journals in support of this contention.xix An arbitrary review of journals like Neuropsychological Rehabilitation, the Journal of Consulting and Clinical Psychology and the Annals of Neurology reveals some common features: that injury, or damage through stroke, to the brain – especially the parts that control emotion and behaviour, the frontal lobe and limbic system – can alter the way the victim expresses or feels emotion, and can cause a variety of emotional problems or motivational disturbances. The victim can have difficulty in controlling his emotions, or suffer ‘mood swings’. Whereas some may experience emotions (such as anger) intensely and quickly but with minimal lasting effect, others display what is known as emotional lability – being on an ‘emotional rollercoaster’ – whereby they can be sad, happy and angry in quick succession. There can also be sudden episodes of crying or laughing, sometimes without any apparent connection between the emotional expression and the situation in question (such as crying without feeling sad, laughing without feeling happy, or laughing at a sad story).

We take absolutely no issue with all this. But we seriously question the unwarranted inference drawn that there is no subjective state and that changes in brain states (i.e. physical mechanisms and processes in the brain) are the sole determining factor affecting our emotions and feelings, somewhat like, to use a crude analogy, a motor car manoeuvring solely according to the steering mechanisms and wheels. Damage the steering mechanism and the car would turn corners awkwardly: damage the brain and emotions are affected because specific personality traits and emotions are governed by specific centres in the brain. So, it is argued, there can be no subjective individual to experience the emotion any more than there is an inner ‘car soul’ to ‘feel’ any change in its steering mechanism. You are quite simply the way you behave corresponding to how your brain states respond to external physical stimuli just as your car behaves on the road according to the way its mechanisms respond to the driver (who in any case, is just another kind of machine). Applying this logic, then, any deprivation or non-functioning of the human sensory apparatus would not register any feelings or emotion associated with the corresponding sensory input. So a blind person could not see and a deaf person could not hear; hence they would not feel any emotion that a normally sighted and hearing person would experience from, say, the view of a city from a tall building or a sublime piece of music.

REFERENCES

i ‘Wet’ because the hardware is biological rather than constituted of silicon and other physical chemicals.

iii The binary code in a computer is a coding system using the binary digits 0 and 1 to represent a digit, letter, or other character.

iv Dennett, Daniel, Consciousness Explained, 1991.

v ‘There probably is a God. Many things are easier to explain if there is than if there isn't.’ As quoted in John von Neumann: The Scientific Genius Who Pioneered the Modern Computer, Game Theory, Nuclear Deterrence and Much More (1992) by Norman Macrae, p. 379.

vi However, the Editor-in-Chief of the 1st March 2014 edition of The Week sums up the whole thing very neatly. He points out that such predictions are a geek’s pipe dream: being like a human is not to be human. Sophisticated machine codes and algorithms may provide the former, but never the latter.

vii http://www.commentarymagazine.com/article/the-closing-of-the-scientific-mind/.

viii Hopefully this will be a bloodless war as computers do not yet have a blood supply.

ix Sir Francis Walshe, ‘Thoughts Upon the Equation of Mind with Brain’, Brain – A Journal of Neurology, March 1953.

x Similar sentiments were echoed by Schrödinger.

xi The film 2001: A Space Odyssey presaged just this by featuring HAL 9000, a sentient computer (i.e. with ‘emotional software’) that is every bit humankind’s equal and has full control over the spacecraft. When things go wrong and HAL endangers the crew’s lives for the sake of the mission, the astronaut first has to overpower the computer.

xii Thomas Nagel, Mind and Cosmos, Oxford University Press, 2012.

xiii Proposing the hypothesis of morphic resonance to explain the characteristic form and organization of nature.

xiv J. Maddox, ‘A Book for Burning?’, Nature 293 (1981): pp. 245-46.

xv Sheldrake was a Scholar of Clare College, Cambridge and was awarded the University Botany Prize. He then studied philosophy and history of science at Harvard University, where he was a Frank Knox Fellow, before returning to Cambridge, and becoming a Fellow of Clare College where he was Director of Studies in biochemistry and cell biology. As the Rosenheim Research Fellow of the Royal Society he carried out research on the development of plants and the ageing of cells at Cambridge.

xvi BBC 2 TV, Heretics, 19 July 1994.

xvii B. Josephson, ‘Incendiary Subjects’, Nature 294 (1981): p. 594.

xviii An obvious example of such behaviour being Richard Dawkins. Unable to understand the symbolism and inner meaning of religious doctrine, his only resort is to revile it in sentences such as: ‘I have described atonement, the central doctrine of Christianity, as vicious, sadomasochistic, and repellent. We should also dismiss it as barking mad.’ Richard Dawkins, The God Delusion, Transworld Publishers, 2006, p. 287.

xix This site, for example, is especially noteworthy – click here