Yes, Your Brain Certainly Is a Computer
Thursday, May 19, 2016
http://recursed.blogspot.ca/2016/05/yes-your-brain-certainly-is-computer.html
- Did you hear the news, Victoria? Over in the States those clever Yanks have invented a flying machine!
- A flying machine! Good heavens! What kind of feathers does it have?
- Feathers? It has no feathers.
- Well, then, it cannot fly. Everyone knows that things that fly have feathers. It is preposterous to claim that something can fly without them.
OK, I admit it, I made that dialogue up. But that's what springs to mind when I read yet another claim that the brain is not a computer, nor like a computer, and even that the language of computation is inappropriate when talking about the brain.
The most recent foolishness along these lines was penned by psychologist Robert Epstein. Knowing virtually nothing about Epstein, I am willing to wager that (a) he has never taken a course in the theory of computation, (b) could not pass the simplest undergraduate exam in that subject, (c) does not know what the Church-Turing thesis is, and (d) could not explain why the thesis is relevant to the question of whether the brain is a computer or not.
Here are just a few of the silly claims by Epstein, with my commentary:
"But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently."
-- Well, Epstein is wrong. We, like all living things, are certainly born with "information". To name just one obvious example, there is an awful lot of DNA in our cells. Not only is this coded information, it is even coded in base 4, whereas modern digital computers use base 2 -- the analogy is clear. We are certainly born with "rules" and "algorithms" and "programs", as Francis Crick explains in detail about the human visual system in The Astonishing Hypothesis.
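To make the base-4 point concrete, here's a little Python sketch (the particular bit assignment is arbitrary; any pairing of bases to two-bit codes works): a DNA strand is literally a string over a four-letter alphabet, with each base carrying two bits.

    # Each DNA base carries exactly two bits of information.
    # The assignment below is arbitrary -- any pairing of the
    # four bases with the four two-bit codes would do.
    BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

    def dna_to_bits(strand):
        """Re-encode a base-4 DNA string as a base-2 bit string."""
        return "".join(BITS[base] for base in strand)

    print(dna_to_bits("GATTACA"))   # prints 10001111000100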
"We don’t store words or the rules that tell us how to manipulate them."
-- We certainly do store words in some form. When we are born, we are unable to pronounce or remember the word "Epstein", but eventually, after being exposed to enough of his silly essays, suddenly we gain that capability. From where did this ability come? Something must have changed in the structure of the brain (not the arm or the foot or the stomach) that allows us to retrieve "Epstein" and pronounce it whenever something sufficiently stupid is experienced. The thing that is changed can reasonably be said to "store" the word.
As for rules, without some sort of encoding of rules somewhere, how can we produce so many syntactically correct sentences with such regularity and consistency? How can we produce sentences we've never produced before, and have them be grammatically correct?
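The point is easy to illustrate: a handful of stored rules generates unboundedly many sentences, including ones never produced before. Here's a toy Python sketch (the grammar itself is made up for illustration):

    import random

    # A toy context-free grammar: a few stored rules suffice to
    # produce endlessly many new, grammatically correct sentences.
    GRAMMAR = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"], ["Det", "Adj", "N"]],
        "VP":  [["V", "NP"]],
        "Det": [["the"], ["a"]],
        "Adj": [["silly"], ["astonishing"]],
        "N":   [["brain"], ["computer"], ["essay"]],
        "V":   [["stores"], ["processes"]],
    }

    def generate(symbol="S"):
        if symbol not in GRAMMAR:              # a terminal: an actual word
            return [symbol]
        rule = random.choice(GRAMMAR[symbol])  # pick a stored rule
        return [word for sym in rule for word in generate(sym)]

    print(" ".join(generate()))  # e.g. "a silly essay processes the brain"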
"We don’t create representations of visual stimuli"
-- We certainly do. Read Crick.
"Computers do all of these things, but organisms do not."
-- No, organisms certainly do. They just don't do it in exactly the same way that modern digital computers do. I think this is the root of Epstein's confusion.
Anyone who understands the work of Turing realizes that computation is not the province of silicon alone. Any system that can do basic operations like storage and rewriting can do computation, whether it is a sandpile, or a membrane, or a Turing machine, or a person. Today we know (but Epstein apparently doesn't) that every such system has essentially the same computing power (in the sense of what can be ultimately computed, with no bounds on space and time).
"The faulty logic of the IP metaphor is easy enough to state. It is based on a faulty syllogism – one with two reasonable premises and a faulty conclusion. Reasonable premise #1: all computers are capable of behaving intelligently. Reasonable premise #2: all computers are information processors. Faulty conclusion: all entities that are capable of behaving intelligently are information processors."
-- This is just utter nonsense. Nobody says "all computers are capable of behaving intelligently". Take a very simple model of a computer, such as a finite automaton with two states computing the Thue-Morse sequence. I believe intelligence is a continuum, and I think we can ascribe intelligence to even simple computational models, but even I would say that this little computer doesn't exhibit much intelligence at all. Furthermore, there are good theoretical reasons why finite automata don't have enough power to "behave intelligently"; we need a more powerful model, such as the Turing machine.
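In case it isn't obvious just how simple that machine is, here's a Python sketch of it: two states, toggling on each 1 bit of n, which computes exactly the Thue-Morse sequence.

    # A two-state automaton for the Thue-Morse sequence:
    # t(n) = parity of the number of 1 bits in the binary expansion of n.
    def thue_morse(n):
        state = 0                   # start state
        for bit in bin(n)[2:]:      # feed the automaton the bits of n
            if bit == "1":
                state ^= 1          # the only transition that changes state
        return state

    print([thue_morse(n) for n in range(16)])
    # [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]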
The real syllogism goes something like this: humans can process information (we know this because humans can do basic tasks like addition and multiplication of integers). Humans can store information (we know this because I can remember my social security number and my birthdate). Things that both store information and process it are called (wait for it) computers.
"a thousand years of neuroscience will never locate a representation of a dollar bill stored inside the human brain for the simple reason that it is not there to be found."
-- Of course, this is utter nonsense. If there were no representation of any kind of a dollar bill in a brain, how could one produce a drawing of it, even imperfectly? I have never seen (just to pick one thing at random) a crystal of the mineral Fletcherite, nor even a picture of it. Ask me to draw it and I will be completely unable to do so, because I have no representation of it stored in my brain. But ask me to draw a US dollar bill (in Canada we no longer have them!) and I can do a reasonable, though not exact, job. How could I possibly do this if I have no information about a dollar bill stored in my memory anywhere? And how is it that I fail for Fletcherite?
"The idea, advanced by several scientists, that specific memories are somehow stored in individual neurons is preposterous"
-- Well, it may be preposterous to Epstein, but there is evidence for it, at least in some cases.
"A wealth of brain studies tells us, in fact, that multiple and sometimes large areas of the brain are often involved in even the most mundane memory tasks."
-- So what? What does this have to do with anything? There is no requirement, in saying that the brain is a computer, that memories and facts and beliefs be stored in individual neurons. Storage that is partitioned across various locations, "smeared" across the brain, is perfectly compatible with computation. It's as if Epstein has never heard of digital neural networks, where one can similarly say that a face is not stored in any particular location in memory, but rather distributed across many of them. These networks even exhibit some characteristics of brains, in that damaging parts of them doesn't entirely get rid of the stored data.
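A classic way to see this is a Hopfield-style associative memory, in which every stored pattern is smeared across the entire weight matrix. Here's a minimal Python sketch (the sizes and the damage rate are arbitrary choices for illustration): even after a third of the connections are destroyed, a noisy cue still recalls the stored pattern.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 100, 3                           # 100 neurons, 3 stored patterns
    patterns = rng.choice([-1, 1], size=(k, n))

    # Hebbian storage: each pattern is distributed over the whole matrix.
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0)

    # Damage: destroy roughly 30% of the connections (symmetrically).
    damage = rng.random((n, n)) < 0.3
    W = W * ~(damage | damage.T)

    # Recall pattern 0 from a cue with 10 bits flipped.
    cue = patterns[0].copy()
    cue[rng.choice(n, size=10, replace=False)] *= -1
    state = cue
    for _ in range(20):                     # let the network settle
        state = np.where(W @ state >= 0, 1, -1)

    # Overlap of 1.0 means perfect recall; it typically stays near 1.0
    # despite the damage.
    print("overlap with stored pattern:", (state @ patterns[0]) / n)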
"My favourite example of the dramatic difference between the IP perspective and what some now call the ‘anti-representational’ view of human functioning involves two different ways of explaining how a baseball player manages to catch a fly ball – beautifully explicated by Michael McBeath, now at Arizona State University, and his colleagues in a 1995 paper in Science. The IP perspective requires the player to formulate an estimate of various initial conditions of the ball’s flight – the force of the impact, the angle of the trajectory, that kind of thing – then to create and analyse an internal model of the path along which the ball will likely move, then to use that model to guide and adjust motor movements continuously in time in order to intercept the ball.
"That is all well and good if we functioned as computers do, but McBeath and his colleagues gave a simpler account: to catch the ball, the player simply needs to keep moving in a way that keeps the ball in a constant visual relationship with respect to home plate and the surrounding scenery (technically, in a ‘linear optical trajectory’). This might sound complicated, but it is actually incredibly simple, and completely free of computations, representations and algorithms."
-- This is perhaps the single stupidest passage in Epstein's article. He doesn't seem to know that "keep moving in a way that keeps the ball in a constant visual relationship with respect to home plate and the surrounding scenery" is an algorithm. Tell that description to any computer scientist, and they'll say, "What an elegant algorithm!". In exactly the same way, the way raster graphics machines draw a circle is a clever technique called "Bresenham's algorithm". It succeeds in drawing a circle using linear operations only, despite not having the quadratic equation of a circle (x−a)² + (y−b)² = r² explicitly encoded in it.
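For the record, here's a Python sketch of the midpoint version of that circle-drawing idea: a whole circle from nothing but additions and comparisons, with the quadratic equation nowhere in sight.

    def circle_points(r):
        """Integer circle of radius r using only additions and comparisons."""
        points = []
        x, y, d = r, 0, 1 - r
        while x >= y:
            # one computed point, reflected into all eight octants
            points += [(x, y), (y, x), (-x, y), (-y, x),
                       (-x, -y), (-y, -x), (x, -y), (y, -x)]
            y += 1
            if d < 0:
                d += 2 * y + 1          # midpoint still inside the circle
            else:
                x -= 1
                d += 2 * (y - x) + 1    # midpoint outside: step inward
        return points

    print(sorted(set(circle_points(3))))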
But more importantly, it shows Epstein hasn't thought seriously at all about what it means to catch a fly ball. It is a very complicated affair, involving coordination of muscles and eyes. When you summarize it as "the player simply needs to keep moving in a way that keeps the ball in a constant visual relationship with respect to home plate and the surrounding scenery", you hide the amazing amount of computation and algorithmic work going on behind the scenes to coordinate movement, keep the player from falling over, and so forth. I'd like to see Epstein design a walking robot, let alone a running robot, without any algorithms at all.
"there is no reason to believe that any two of us are changed the same way by the same experience."
-- Perhaps not. But there is reason to believe that many of us are changed in approximately the same way. For example, all of us learn our natural language from parents and friends, and we somehow learn approximately the same language.
"We are organisms, not computers. Get over it."
-- No, we are both organisms and computers. Get over it!
"The IP metaphor has had a half-century run, producing few, if any, insights along the way."
-- Say what? The computational model of the brain has had enormous success. Read Crick, for example, to see how the computational model has had some success in modeling the human visual system. Here's an example from that book that I give in my algorithms course at Waterloo: why is it that humans can find a single red R in a field of green R's almost instantly, whether there are 10 or 1000 letters, or a single red R in a field of red L's almost as quickly, but have trouble finding the unique green R in a large sea of green L's and red R's and red L's? If you understand algorithms and the distinction between parallel and sequential algorithms, you can explain this. If you're Robert Epstein, I imagine you just sit there dumbfounded.
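Here's the algorithmic distinction as a rough Python sketch (the feature-map framing follows the standard account of visual search; the numbers are made up). A single-feature target can be detected by one feature map computed in parallel across the whole field, at a depth independent of the number of letters; the conjunction target forces an item-by-item scan.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    color = rng.integers(0, 2, n)   # 0 = red, 1 = green
    shape = rng.integers(0, 2, n)   # 0 = R,   1 = L

    # Feature search (red among green): one feature map, computed in
    # parallel across the whole field -- constant depth regardless of n.
    red_map = (color == 0)
    print("pop-out target present:", bool(red_map.any()))

    # Conjunction search (green R among green L's, red R's, red L's):
    # no single map suffices; binding color to shape proceeds item by
    # item, so search time grows with the number of letters.
    def serial_conjunction_search(color, shape):
        for i in range(len(color)):              # one item per "glance"
            if color[i] == 1 and shape[i] == 0:  # green AND R
                return i
        return None

    print("conjunction target at:", serial_conjunction_search(color, shape))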
Other examples of successes include artificial neural nets, which have huge applications in things like handwriting recognition, face recognition, classification, robotics, and many other areas. They draw their inspiration from the structure of the brain, and somehow manage to function enormously well; they are used in industry all the time. If that is not great validation of the model, I don't know what is.
I don't know why people like Epstein feel the need to deny things for which the evidence is so overwhelming. He behaves like a creationist in denying evolution. And like creationists, he apparently has no training in a very relevant field (here, computer science) but still wants to pontificate on it. When intelligent people behave so stupidly, it makes me sad.
P.S. I forgot to include one of the best pieces of evidence that the brain, as a computer, is doing things roughly analogous to what digital computers do, and is certainly no more powerful than our ordinary RAM model or multitape Turing machine. Here it is: mental calculators who can do large arithmetic calculations are known, and their feats have been catalogued: they can do things like multiply large numbers or extract square roots in their heads, without pencil and paper. But in every example known, their extraordinary computational feats are restricted to things for which we know there exist polynomial-time algorithms. None of these computational savants has ever, in the histories I've read, been able to factor arbitrary large numbers in their heads (say, numbers of 100 digits that are the product of two primes). They can multiply 50-digit numbers in their heads, but they can't factor. And, not surprisingly, no polynomial-time algorithm for factoring is currently known, and perhaps there isn't one.
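You can feel the asymmetry in a few lines of Python: multiplying two 50-digit numbers is instantaneous, but recovering the factors of a 100-digit semiprime by the obvious method would take on the order of 10^50 trial divisions.

    import random

    # Multiplying two 50-digit numbers: a polynomial-time operation,
    # effectively instantaneous.
    a = random.randrange(10**49, 10**50)
    b = random.randrange(10**49, 10**50)
    n = a * b
    print(len(str(n)), "digit product, computed in microseconds")

    # Going the other way: if a and b had been primes, recovering them
    # from n by trial division would take on the order of
    # sqrt(n) ~ 10**50 divisions. At a billion divisions per second,
    # that's roughly 10**33 years -- and no polynomial-time factoring
    # algorithm is known.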