
View Full Version : Speed of Computers vs. Brain


cac2889
02-19-2003, 03:46 AM
This question is something I have always wanted the answer to. I'm sorry if it's in the wrong forum, but I could not find a more suitable one to put it in.

Question: How fast is the human brain in terms of computer speed (for example, the fastest consumer computer out there right now is 3.06 GHz)? Anyone have a clue?

Valkyrien
02-19-2003, 04:06 AM
very very fast.;)

Marc Andreoli
02-19-2003, 04:40 AM
...and parallel :applause:

GregHess
02-19-2003, 05:42 AM
I think someone did an estimation of the amount of visual data the brain processes per second.

They estimated that both overlapping eyes together could possibly resolve a one million by one million pixel image. Taking into account the number of cones present in each eye, they pegged it at 48-bit color resolution.

The human eye sends visual data to the brain about every 20 milliseconds. So taking this possible data into account...

That's a huge number of terabytes every 20 milliseconds, if not higher.
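
To put rough numbers on that (a quick back-of-envelope in Python, using the estimate's own figures, not measured values):

pixels = 1_000_000 * 1_000_000   # the one million x one million estimate
bits_per_pixel = 48              # the estimated 48-bit color
frame_interval_s = 0.020         # one update every 20 ms

terabytes_per_frame = pixels * bits_per_pixel / 8 / 1e12
print(f"{terabytes_per_frame:.0f} TB per frame")                 # 6 TB
print(f"{terabytes_per_frame / frame_interval_s:.0f} TB per s")  # 300 TB/s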

Take into account this information is either

a) Acted upon
b) Stored temporarily
c) Processed and stored in part.
d) Processed and stored as a whole.
e) Discarded.

This is of course occurring simultaneously as new visual data comes in.

And this is just for the eyes. Imagine all the secondary input systems, taste, temp, pressure, audio, olfactory, internal diagnostics, pain, pleasure, answering odd questions on cgtalk...etc.

No computer even comes close. Hell, they can barely get a robot to walk on two legs.

kwshipman
02-19-2003, 07:14 AM
I actually believe that the "average" human brain is a bottleneck in the whole process. There are billions of things happening at any given time that we can't comprehend. That is what causes us to faint sometimes: our brain has overloaded and shuts down temporarily.

I had this customer yesterday who kept getting distracted by a flashing light I had on a button (every thirty seconds: "oh, my granddaughter would love that so much", "oh my, that light is soo pretty"). I don't think her brain processed at the same rate as some.

Another thought: you can compare measures of input such as sight, touch, and the other senses, but how can one measure the computational power of emotions? When you got your first kiss, all logic flew out the window and you acted on pure animal instinct. How do you calculate the thought processes when you hear your significant other say "I love you"?

loop29
02-19-2003, 07:24 AM
Oh yeah, this is a very impressive 10 : 0 for the brain. And most of this data is absorbed unconsciously. That means you are able to see and handle data while you are thinking about other, related stuff, which makes it possible to do things without the brain's full attention. OK, it's a part of the brain that handles this information, but it is still a process that allocates different weights to different tasks, giving you the freedom to run the locomotor system (for example) without really spending brain power on it. Now bear in mind that motions/actions you do a lot (sports, mental math, or even 3D/CAD :p, though I'm not really sure about 3D/CAD) can be practiced so much that one day, once you reach a certain perfection, you don't even have to think about them.
This is more a feature view of PC vs. brain, but taking the brain's ability to learn into account leaves the PC far far far......... behind.


regards

MCronin
02-19-2003, 07:44 AM
I read an article a few years ago that tried to answer this question. The conclusion they came to was something like 100 million MIPS. By that estimate, the average human brain is as capable as about 12,250 AMD XP 3000+ CPUs. Based on Moore's law, that means we'd have personal computers as capable as the average human about 21 years from now.
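
If you want to check the arithmetic yourself (a quick Python sketch; the MIPS figures are just that article's estimates):

import math

brain_mips = 100_000_000        # the article's ~100 million MIPS estimate
cpus = 12_250                   # claimed AMD XP 3000+ equivalents
print(brain_mips / cpus)        # implied per-CPU rating: ~8163 MIPS

# Moore's law: performance doubling roughly every 18 months
print(math.log2(cpus) * 1.5)    # ~20 years, roughly the article's 21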

singularity2006
02-19-2003, 08:18 AM
Was it really that low? The last estimate I heard is that a computer would have to run @ 1000 terahertz to match the speed of a brain. Or was it 1000 GHz?

MCronin
02-19-2003, 08:39 AM
MHz don't matter; we're talking about instructions per second here. Even the 100 million MIPS estimate doesn't give you a good idea of just how sophisticated the human brain is. In addition to processing an obscene amount of data whether we are conscious of it or not, the brain is able to filter data many times faster than the fastest supercomputers. Computers are dumb and will work through a job to completion. The human mind is able to decide at any point in the process of sifting through data to disregard the current "job" and move on to something else, and it's able to make branch predictions and multitask better than any piece of silicon we could devise.

Flyby
02-19-2003, 08:51 AM
Originally posted by GregHess
And this is just for the eyes. Imagine all the secondary input systems, taste, temp, pressure, audio, olfactory, internal diagnostics, pain, pleasure, answering odd questions on cgtalk...etc.
No computer even comes close. Hell, they can barely get a robot to walk on two legs.

Not so long ago, I watched a documentary about the training of combat helicopter pilots.
Those people are trained to assess different data streams that come from each eye separately!!
Having one eye on the instruments and one to fly around with night vision, combining both to avoid obstacles, counter enemy fire, and pick out targets... phew..

So, on a dual screen, if you can model in 3dsmax on one screen and read a Siggraph paper about deep shadows on the other AT THE SAME TIME, while answering the phone and writing down the lyrics of your preferred song, you might be a good helicopter pilot candidate... :eek:

I often work in "multitasking mode" by having several things open to work on. However, I tend to act in more of a "serial burst" method: focusing for a short period on one application, storing the info gathered, doing another job for a few seconds or minutes, then coming back to the first application to continue where I left off. But this is no real parallel acquisition of information, like those Apache pilots manage...
I guess there is a reason why so few are chosen for the job...:rolleyes:

marc
02-19-2003, 11:56 AM
we are not computers. this thread is an insult. j/k. I have no comments so i thought i'd spam.

gustojunk
02-19-2003, 02:12 PM
Also keep in mind that the brain can be overclocked very easily: no jumpers or weird BIOS flashes. If it runs too hot from overclocking, all you need is a cold shower, a couple of shots of scotch, or a day off. If that does not help, a vacation will fix all those memory leaks and applications hogging your RAM.

Greg, any comments on the risk of overclocking this one? what would the failure rate be?

GregHess
02-19-2003, 03:56 PM
Greg, any comments on the risk of overclocking this one? what would the failure rate be?

See...Radical...Zealot...Asylum.

cac2889
02-19-2003, 04:13 PM
All of these answers are absolutely brilliant. I was hoping for a good discussion about where we are now in terms of computing power and how long it would take to make machines that can basically BE humans.

Summing up what everyone said, it looks to me like the biggest breakthrough will be the invention of a computer that can handle more than one job at a time (multiple instructions on one bus line). It may seem impossible now, but I am waiting for the invention of a computer that can do three or four jobs simultaneously. Talk about power.:drool:

Tommi
02-19-2003, 05:16 PM
Just buy a computer with two or three CPUs... :p :D

cac2889
02-19-2003, 06:23 PM
Originally posted by Tommi
Just buy a computer with two or three CPUs... :p :D

Well...I meant a computer that didn't cost you an arm, leg, your life savings, and half of your manhood.:D :p

danylyon
02-19-2003, 08:43 PM
Originally posted by cac2889
Summing up what everyone said, it looks to me like the biggest breakthrough will be the invention of a computer that can handle more than one job at a time (multiple instructions on one bus line).

Nope, that's just a small piece.. I think the biggest breakthrough will be computers which program themselves.

You can't really compare brains to computers yet, because the brain doesn't have storage and CPU separated. It does not have RAM, a HD, or anything like that (in an analogous way). It's basically just one big, ever-changing "CPU".

Sorry.. hard to explain in a foreign language :hmm:

cac2889
02-19-2003, 09:22 PM
Originally posted by danylyon
Nope, that's just a small piece.. I think the biggest breakthrough will be computers which program themselves.

Hmm...very good point, danylyon. I think we're already beginning to see that kind of thing with Windows 2000 Pro and Windows XP. It's simple and small, but remember it has that autocustomize feature that automatically adds and removes icons from your start menu and desktop depending on how often you use them. As for actually programming themselves, that comes extremely close, if not hitting it right on the head, to full-fledged human AI (think The Matrix). I don't know what I would do if my computer started making its own decisions without asking me first.:p

danylyon
02-19-2003, 09:31 PM
.. I could go to bed right now.. *sigh* (10pm here and still working)

I actually really hate any kind of "Microsoft AI".. I stopped using Word after version 95.

I don't know how anyone can get along with Word XP. It's a fight for me.

*type*.. *click*.. "no not centered".. *enter*.. "argh.. no list here" .. *delete*.. "wait not all of it".. *type*... "stop I want this written small"..

and so on and so on..

I love my Wordpad :applause:

jsh3d
02-19-2003, 09:39 PM
Originally posted by cac2889
Summing up what everyone said, it looks to me like the biggest breakthrough will be the invention of a computer that can handle more than one job at one (multiple instructions on one bus line) time. It may seem impossible now, but I am waiting for the invention of a computer that can do three or four jobs simultainiously. Talk about power.:drool:

Isn't this what quantum computing is supposed to do? Or maybe I'm confused :shrug:

-Jsh3d

BrainFaucet
02-20-2003, 04:47 AM
The brain is really close... I think nerves pass data at a maximum of 200 mph. That's really friggin' slow compared to transistors. What the brain has that processors don't is its flexibility and parallel architecture. There isn't a central part of the brain that all information must pass through, so it can be doing hundreds of tasks without breaking a sweat. Another interesting thing is chemical processing in the brain: computers can only use electricity to pass information from one part to another, but the brain can use both electricity and a number of hormones and chemicals to produce extremely interesting results.
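
For scale, a rough latency comparison in Python (the speeds are ballpark figures, not measurements):

nerve_speed = 200 * 0.44704   # 200 mph in m/s, ~89 m/s
wire_speed = 2e8              # signal in copper, roughly 2/3 light speed (m/s)
distance = 1.0                # shoulder to fingertip, about a metre

print(distance / nerve_speed * 1e3, "ms for the nerve")   # ~11 ms
print(distance / wire_speed * 1e9, "ns for the wire")     # ~5 ns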

The brain is also capable of programming itself, and it can learn new "hardware" without drivers. Some people have had plugs implanted in their skulls and can control mouse cursors or listen to an audio signal through these plugs. And the doctors didn't have to teach the brain anything... they just gave the patients a few months to learn the interfaces themselves.

It's amazing how advanced organic systems have evolved. Imagine trying to program ants that build nests, farm molds, hunt prey, war with other colonies, etc... and those are considered quite simple by animal standards.

elvis
02-20-2003, 07:15 AM
i did a number of philosophy subjects in uni that covered the same topic as this. the downside was none of the philosophy lecturers had a background in IT, which meant arguments were pretty one-sided, but here's what i got out of it:

consider for a minute the human as nothing but a processing machine. now consider its needs as a processing unit:

input: audio, video and nervous (touch/taste/pain etc). a constant stream of all input is needed, and processing methods to match (voice recognition, visual recognition, etc).

output: speech, motion: and full processing to control these.

input/output combined and compared: commonly referred to as "feedback" (i touch a razor, i cut myself, the cut hurts, therefore razors are bad) etc.

storage: reconstructive. humans do not store data like a computer (as in snapshots). human storage is constructive. ie: we store what we need, and reconstruct images from this. try to imagine the last conversation you had with a particular person. you don't remember snapshots or soundbites, rather you reconstruct the scene in "your mind's eye".

and the big one: pre-learned data. this is where it gets shady. in philosophy you cannot answer a question by "reframing" it (ie: pushing it back to a previous object). asking "how did humans learn to talk", for instance: well, i was taught by my parents. so who taught them? the answer is pretty obvious until you get to the first human being. who taught them to speak? what was language before speech?

as you can probably guess, you can talk about this stuff for a year (hell, i did!) and not get anywhere.

human beings have an almost unlimited potential for data storage and interpretation, as well as the ability to inform themselves of change surrounding them (eg: learning). to "program" a computer to learn is not learning per se. it's merely pushing your knowledge to a computer, which is not learning but merely a snapshot (the difference between parrot learning and understanding an idea).

the closest thing we have to learning computers at the moment are neural networks. if any of you have done any programming in either cognitive science, neurology or neural nets, you'll know how fun this stuff can be. i did two years of cog sci, and it really rocked. punching in a bunch of heuristics ("levels of importance", for want of a better phrase), and then watching a network make its own decisions based on those levels is great. even better is watching a network build its own heuristics based on feedback (see above) to make further decisions later on.
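
for anyone who hasn't seen one, here's about the smallest possible sketch in python: a single artificial neuron adjusting its own "levels of importance" (weights) from feedback. the data and learning rate are made up for illustration:

import random

inputs  = [(1, 1), (1, 0), (0, 1), (0, 0)]   # toy stimuli
targets = [1, 0, 0, 0]                       # desired reactions (logical AND)
w1, w2 = random.uniform(-1, 1), random.uniform(-1, 1)
bias, rate = 0.0, 0.1

for _ in range(50):                          # repeated feedback rounds
    for (x1, x2), t in zip(inputs, targets):
        out = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
        err = t - out                        # the feedback signal
        w1 += rate * err * x1                # nudge each weight
        w2 += rate * err * x2
        bias += rate * err

print(w1, w2, bias)                          # the "heuristics" it built itself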

the potential for discussion on any of these topics is almost unlimited. i urge anyone interested in these things to try and study them either personally or at a tertiary level if you can. i can tell you now you won't regret it.

i was lucky to study under 3 of the most brilliant minds in the fields of psychology and cognitive science at the University of Queensland. i sometimes wish i'd taken it further and gone into full time research with these people rather than take the easy option and do the consulting i do now.

Flyby
02-20-2003, 09:24 AM
Interesting point of view, Elvis...

Personally, I'm convinced that the rise of information technology is the basis of a new step in human evolution. More specifically, an evolution of the human brain.

For thousands of years, the human mind has been confronted with serial events. For example, most of us have a serial concept of time: we define "time" as past/present/future, and when we try to structure ideas, we use the serial method, a step-by-step procedure.
The parallel behavior of humans has been limited to a physical level: being able to pick something up and look elsewhere, for example.

However, in this beginning age of information technology, we're all starting to feel the pressure of vast amounts of information. And unconsciously, we're desperately looking for a way to handle such an overload of information.
The most obvious thing to do is to reduce the quantity of information we are prepared to absorb and let a huge amount of it slip away.

But because we're getting more and more submerged in this information overload, we humans try to adapt ourselves and are being trained, due to environmental pressure, to deal with multiple information streams in a CONSCIOUS way. Like those Apache helicopter pilots....

I think it is very possible that within a few hundred years a lot more of us will be capable of performing parallel tasks, not only in an unconscious manner, as we do now, but in an active, conscious manner.
I know it may sound like Darwinism, but those capable of performing conscious parallel tasks will have an edge in the coming information technology age...

elvis
02-20-2003, 11:44 AM
interesting. so as we grow more intelligent, and pool more of our mental resources into technology, our technology gets better which we then use to increase the amount of information at our disposal, in turn forcing us to become more intelligent? it all sounds very plausible.

there's no doubt our lives are very different to those of people 2-3 generations ago. look at how many people these days are in tertiary education. look at how many jobs require much more advanced knowledge and skill with specialist technical tool-sets. listen to any person over the age of 50 bitch about how hectic and fast life is and how it used to be so much slower and calmer! :) everything from the evening news to a chat over coffee has changed as we embrace technology. i can just about see news from any city in any country in the world right now if i want, as well as chat to anyone anywhere (hell, i am right now!). tell my great-grandfather 100 years ago that in a century i could sit in front of a screen and chat to people on the other side of the globe, and he would have had a heart attack. :)

we faced the industrial revolution quite a while ago now, which was much more noticeable. i wonder if the "technological revolution" will ever be real and not just a science fiction novel. the rate of computer literacy amongst young kids is quite high these days with schools having computer programs. it's a running joke amongst anyone in tech support that if someone over the age of 40 rings you for help, ask to talk to their kids! :)

Gyan
02-20-2003, 03:30 PM
Originally posted by Flyby

But because we're getting more and more submerged in this information overload, we humans try to adapt ourselves and are being trained, due to environmental pressure, to deal with multiple information streams in a CONSCIOUS way. Like those Apache helicopter pilots....


Can you give an example from everyday life of parallel sensory processing ?

singularity2006
02-20-2003, 06:45 PM
I work @ a supermarket where I must deal w/ many people and situations all at the same time.

A radical case:

I hear the sound of shattering glass in the beer aisle while I'm in the front office selling a customer a money order. I must rush out with my mind focused on the sound of crashing glass as I run. I must take into consideration obstacles in my way, including baskets on the floor, the noise of very loud children, and the annoyance of "managerial calls."

And so while mopping up the very very bad smell of "Elephant Beer," a drunkard asks me where the beer is.... in my annoyance i look down @ the beer. He says "oooh!"

And while cleaning the beer I get an administrative call about what to do with the stuff in the warehouse... with the mop in one hand and my cell phone in another, I am conversing with the guy in the back about where to put everything.

After finishing mopping, I realize that the beer shelves are short on wine coolers and must immediately get that taken care of. While walking with the hand truck to get the goods, I am paying attention to make sure the floor is dry and watching for customers in my way, while @ the same time processing in my head how much of what needs to be taken out onto the shelves....

is that enough information overload for you? =D (there's still more actually.... the joys of multitasking)

Valkyrien
02-20-2003, 09:52 PM
wow, we have the same job Singularity2006...:surprised

Gyan
02-21-2003, 02:37 AM
Originally posted by singularity2006
I work @ a supermarket where I must deal w/ many people and situations all at the same time.

is that enough information overload for you? =D (there's still more actually.... the joys of multitasking)

If you were answering my question, then No.

That's not parallel sensory processing. That's rapid switching and prioritizing. In your case, you take a snapshot of other sensory input and deprioritize them since the dropped bottle has the highest focus. Your focus is always on one event/thread, even if it is rapidly switching between other events.

Northchild
02-21-2003, 01:29 PM
The question is like asking if a lamp fixture is faster than electricity. I don't think that comparing raw processing power, if such a thing exists, gives us a decent grasp of what we really want to know. What are we really asking here? :)

Quizboy
06-21-2004, 07:38 PM
I read an article a few years ago that tried to answer this question. The conclusion they came to was something like 100 million MIPS. By that estimate, the average human brain is as capable as about 12,250 AMD XP 3000+ CPUs. Based on Moore's law, that means we'd have personal computers as capable as the average human about 21 years from now.

We don't have to wait 21 years. We've got processors all around us that are as powerful as the human brain: the human brain!

This thread made a move in the right direction with the discussion of evolving the brain. As for multitasking, it's obvious that the brain can do that cold. Just maintaining heartbeat, breathing, and all the other bodily functions while we perform whatever functions we want is multitasking (but a computer can do this too; it runs background processes). Conscious multitasking? Walking and talking, to take a simple example. A tad more complex is walking while talking, navigating toward your parked car, and removing keys from your pocket. Everyone can do this with no problem, and it's a good example of true multitasking on a daily basis.

How about a quarterback who gets coaching through headphones? That's also multitasking, because he's listening to orders while executing functions on the field. It happens simultaneously, not by switching. Another example is a housewife who talks on the phone while cooking. She doesn't stop talking to stir the soup and add the potatoes. It's simultaneous.

The Apache pilot example is only a situation where they've decided to focus on and capitalize on a recognized, existing capability of the brain, and hone it to extremely conscious and productive ends.

All this brings me to my point which is that the ultimate processing system already exists and the true revolution will be when we figure out more about how to tap into that system and exploit its wealth of processing power.

I envision a future where people will simply rent out their brains for a few years to large companies like Lockheed Martin or Motorola. Much easier way to earn money than studying futilely to keep up with massive leaps in technology. Or the government will draft the use of people's brains in time of war to drive massive automated adaptive war algorithms.

Well take a look at my Machine Flesh Challenge sketches for "Computer Chip Guy" to see what I mean.

heipei
06-21-2004, 08:52 PM
well, although i can't give an estimate of the speed, i can say that i've heard a lot of rumors (from a lot of different women).
according to them, men are probably equipped with what amounts to a single cpu. but women are real multithreading machines, so they have multiple cpus, or at least an HT cpu ;)

they can, for example, be on the telephone, watch tv, cook, wash, play with the kids, sometimes even drive, and all that at the same time! men are simpler. they finish a job and then move on to the next.
now, assuming men and women have equally sized brains (hope no one disagrees here ;) men have more processing power for a single task.

i'm not quite sure what that means. maybe if i was a woman i'd have told someone over the phone what kind of crap i am writing here and they would have prevented me from doing it, but hey, there we go, singlethreaded. i'll realize it after i hit the "submit" button

Vushvush
06-21-2004, 09:52 PM
the one thing not enough people deal with is the optimization of the human brain: the connection-driven methodology on which it is built and, as a result of its architecture, how faulty it is.

Just think about how many mistakes you make in a day, from spelling mistakes (which I'm trying my hardest not to pay attention to as I write this, speeding along and throwing any kind of grammar check to the wind <--- see that!), to mistaken identity, to tripping over something or other, to flat out mixing up two people's names.

How would you feel if your computer sometimes returned a render where the car was the wrong shade of red, because it got confused with the car you worked on yesterday? How would you feel if it forgot a certain function and couldn't calculate that photon map? Give it a few minutes... I'm sure it'll remember, but don't put too much pressure on it or it will throw a fit and shut down for a day!

There are huge differences between the human brain and a glorified calculator! Mind you, none of these things can be judged in MHz, MIPS, or image resolutions. If you think your brain truly processes even 1/100th of the visual input coming in, you're kidding yourself! Look at those experiments where people miss blatantly obvious things right in front of them because they were told to look at a detail off to the side.

It's an interesting discussion, but sadly most of our brains aren't up to the task at the moment. Maybe in 21 years we'll be more adept at this topic! :)

The Found Vertex
06-21-2004, 10:06 PM
You must take into account that as we make better computers, our education system will become MUCH more advanced. Remember when calculus was taught as a graduate course in many colleges just a few years ago?

Now there are 12-year-olds doing pre-calc as a normal study course. I personally learned basic calc and trig at age 8. As humans become more adept with the computer, remember that we adapt to our own minds even more quickly.

Not even 50 years ago, it took months or years to assemble a computer. Now it takes hours, even minutes if you are good. This was not a product of computers getting better; this was a product of the human mind advancing between generations.

You must also take into account that the faster computers get and the more storage they're allowed, the more resources humans will have access to. The more you have access to, the more chance you have to process and store it. This also brings the human mind even closer.

Then genetics come into play. It has been shown in many studies that a person's aptitude for learning is genetic. Newer generations will become enormously more capable than their predecessors as time wears on. Anthropologists have long argued that the human mind actually increases in capacity as generations move on. It's not that we are only learning things that allow us to progress to learn more; it is that we are simply, and completely, just smarter.

Take all this into account when discussing the subject: the capacity of the human mind grows just like anything else. And with the extreme ability of the human mind in its current state, the potential for growth is exponentially larger than the potential for growth of less advanced systems.

TheLostVertex
06-21-2004, 11:14 PM
Minutes to build a computer? Bah, too slow. There are people that can do it in under a minute (speed-building contests, entertainment for the dull mind)

-Steven

The Found Vertex
06-21-2004, 11:26 PM
Minutes to build a computer? Bah, too slow. There are people that can do it in under a minute (speed-building contests, entertainment for the dull mind)

-Steven

Cool. I'm coming over to see that, you better have videos!!

-R :) bert

stephen2002
06-22-2004, 12:08 AM
The Brain = a whole lot of specially designed circuits that do one or two things and do them really well. Circuits for visual pattern recognition, sound recognition, etc. Basically, you put an impulse in the input and you get a certain output. They don't do "calculations" like a computer does. It is kinda like specialized video processing chips, audio processing chips, and such, all strung together and operating just about seamlessly.

Computer = one general-purpose circuit surrounded by a few specially designed circuits. Everything must be converted into something that the general-purpose circuit can handle, often at a loss of performance. Think of the difference between a GPU doing your graphics and the CPU doing your graphics.

So, how much power would it take for a computer to emulate what a brain does? A whole heck of a lot. Now, if one were to design special circuits (sometimes analog ones) that mimic the brain's functions, you could get a whole lot closer a whole lot faster.

singularity2006
06-22-2004, 04:52 AM
OooOH... Data, from Star Trek!!! :scream:

colintheys
06-22-2004, 06:29 AM
One important thing to remember is that we are not talking merely about sheer processing power when discussing the 'speed' of the human brain. The 'software' is also excellent beyond current comprehension. For instance, a recent estimate stated that if the brain were to brute-force compute the kinematics required for normal movement, the neural matter required would be about the size of an aircraft carrier. Instead, movement in reality is intelligently estimated and handed off to lower systems to produce fast, computationally simple solutions.

One of the most incredible things, in my opinion, is the 'hundred-step rule.' I may have the numbers here wrong, but the gist is the same. A neuron firing takes about 2 milliseconds. An average task is completed in 200 ms. That means there can be only 100 serial steps between an average input and output. Next time someone startles you and you whip around and grab their hand, think about how much data was just crunched and how amazingly quickly it was done. Then consider that the computation was completed in fewer than 100 serial steps! It probably takes more steps than that to open the start menu...

MadMax
06-22-2004, 06:40 AM
This question is something I have always wanted the answer to. I'm sorry if its in the wrong forum but I could not find a more suitable one to put it in.

Question: How fast is the human brain in terms of Computer speed (for example the fastest consumer computer out there right now is 3.06GHz). Anyone have a clue?

Based on some responses to threads on this forum and others, some people's brains seem to operate at 1.77 MHz.

creative destructions
06-22-2004, 07:06 AM
Here's a test you can do: picture the Cornell box and see how fast you can do the radiosity calculations in your head. On the topic of Computer vs. Brain, I wouldn't count motor functions as computations, considering every other animal on the planet is more adept and active than we are. They're basically mechanical devices, like peripherals for a computer.

http://www.graphics.cornell.edu/online/box/simulated.jpg

colintheys
06-22-2004, 07:08 AM
I wouldn't count motor functions as computations, considering every other animal on the planet is more adept and active than we are. They're basically mechanical devices, like peripherals for a computer.

er.... I just finished an entire course on movement. :) I assure you that the computations required for movement are most definitely real and most decidedly complex! For instance, if you want to move your hand to your mouse, you first have to reconcile your visual coordinate system with your motor coordinate system just to determine the movement that must be made (i.e. hand 6 inches forward). Then you must divide that general instruction into thousands of muscle commands which will result in precisely that movement, often ballistically, ignoring normal feedback. Inverse kinematics is designed to simulate precisely this division (and it doesn't do a very good job). Then, of course, there are things like balance. How about ballet?
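
For a feel of what even the crude CG version involves, here's a minimal two-link inverse kinematics sketch in Python (the link lengths are made up; a real arm adds many more joints, plus muscles):

import math

def two_link_ik(x, y, l1=0.3, l2=0.3):
    # solve shoulder and elbow angles so the "hand" reaches (x, y)
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

print(two_link_ik(0.4, 0.2))   # joint angles in radians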

Yes, many animals have better motor coordination than we do. However, animals also have brains! Our spinal cord and brainstem compare very closely to those of a cat (on the >90% level). Rather than say motor systems are the peripherals, I would say they are the underlying architecture of the whole system. After all, a computation that results in no movement is essentially useless. The architecture of the brain would suggest that the whole system evolved to allow movement, and each newer, more complex system can be viewed as a more powerful control mechanism for movement. i.e., the spinal cord controls reflexes in the leg and much of walking (hence why a chicken will run around with its head cut off), but the brainstem can take over if it needs to. And, of course, if you want, you can voluntarily control your leg at the highest possible level: the frontal cortex. I'm not saying that thought is useless! Just that, from an evolutionary perspective, thought that results in no change of actual, movement-oriented behavior is useless. Anyway, the point is that motor processing, far from being an extra, is really the core of the whole brain!

Sorry to blither on like that. Just finished that course and needed to get it off my chest, but decided to stop so as not to fill up the forum and make a fool of myself. :)


Quick, what's the square root of 893475.72947? Now compute these ten billion other instructions, so we can move these 5000 triangles around in world space at 60 frames per second. It's just my view, but academics have no use in the real world

Consider that a software issue. Clearly the brain is capable of doing that. It performs scalizzions (technical term) of fast Fourier transforms just for vision. Similarly, your computer is clearly capable of doing integrals. However, if you open Windows calculator and try to do one in a single step, you might have trouble!
Quick, look towards your bathroom! You keep a 3D mental map stored in your head of nearly every location you have ever visited and can orient yourself to any point on that map you have managed to reference. Where is your toothpaste? Where was your toothpaste when you were 12? You probably know to the inch... 5000 triangles...? ;)



Based on some responses to threads on this forum and others, some people's brains seem to operate at 1.77 MHz.

Yeah, and sometimes I wish they had a "turbo" button I could mash when I got annoyed with them...

creative destructions
06-22-2004, 07:26 AM
You must take into account that as we make better computers, our education system will become MUCH more advanced. Remember when calculus was taught as a graduate course in many colleges just a few years ago?

Now there are 12-year-olds doing pre-calc as a normal study course. I personally learned basic calc and trig at age 8. As humans become more adept with the computer, remember that we adapt to our own minds even more quickly.

Not even 50 years ago, it took months or years to assemble a computer. Now it takes hours, even minutes if you are good. This was not a product of computers getting better; this was a product of the human mind advancing between generations.

You must also take into account that the faster computers get and the more storage they're allowed, the more resources humans will have access to. The more you have access to, the more chance you have to process and store it. This also brings the human mind even closer.

Then genetics come into play. It has been shown in many studies that a person's aptitude for learning is genetic. Newer generations will become enormously more capable than their predecessors as time wears on. Anthropologists have long argued that the human mind actually increases in capacity as generations move on. It's not that we are only learning things that allow us to progress to learn more; it is that we are simply, and completely, just smarter.

Take all this into account when discussing the subject: the capacity of the human mind grows just like anything else. And with the extreme ability of the human mind in its current state, the potential for growth is exponentially larger than the potential for growth of less advanced systems.
More on this topic: I'm fairly certain our floating point units have degraded or malfunctioned since the birth of the computer. Quick, what's the square root of 893475.72947? Now compute these ten billion other instructions, so we can move these 5000 triangles around in world space at 60 frames per second. It's just my view, but academics have no use in the real world.

Sieb
06-22-2004, 07:47 AM
And don't forget that most humans do all of this while only utilizing about 10% of their brain.

System Idle Process = 90%

:P

creative destructions
06-22-2004, 08:03 AM
er.... I just finished an entire course on movement. :) I assure you that the computations required for movement are most definitely real and most decidedly complex! For instance, if you want to move your hand to your mouse, you first have to reconcile your visual coordinate system with your motor coordinate system just to determine the movement that must be made (i.e. hand 6 inches forward). Then you must divide that general instruction into thousands of muscle commands which will result in precisely that movement, often ballistically, ignoring normal feedback. Inverse kinematics is designed to simulate precisely this division (and it doesn't do a very good job). Then, of course, there are things like balance. How about ballet?

Na, no way. Even in computers, joint and bone simulations are much simpler than, say, raytracing. Three floating-point rotational axes for an elbow is hardly compute-intensive. Much of the basic motor function of humans and animals is fixed to begin with. There are only a certain number of ways you can run, only certain ways you can walk, only a certain number of ways you can bend your body, etc. It might seem complex, but I would look at the instructions like the ROM of a BIOS. Latency = reflexes; bus speed = number of repeated movements per minute, like the strides of a runner. It forms the basis of the structural system, but it's not really related to the core computational power of the brain. A lot of the human motor research was done in the 90s, and computing power has multiplied tenfold since then. A robotic arm can easily move faster than our own, and I won't even go into strength, as machines beat us easily there; as for legs, well, machines have wheels. Apples and oranges? Well, clearly the machines win, since they can also fly. Humans are inferior to machines in many ways; the only thing a computer truly lacks is the ability to create. If we can program a machine to write its own code, then that in itself solves AI and the foundation of life as we know it. I guess that goes into quantum computing.
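
To make that concrete, here's the whole "elbow" in a few lines of Python (the length and angle are made-up values):

import math

def rotate_elbow(forearm_len, angle_rad, elbow=(0.0, 0.0)):
    # wrist position after rotating the forearm about the elbow
    return (elbow[0] + forearm_len * math.cos(angle_rad),
            elbow[1] + forearm_len * math.sin(angle_rad))

print(rotate_elbow(0.25, math.radians(30)))   # metres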

kinich
06-22-2004, 10:37 AM
hey, how about social skills, interacting with other humans AND other life forms?

everyone here, except like 2 people, is going into the logic part of it.

how about all those random things that make being human worth it, like a pregnant woman's mood swings? or simply staring randomly at a beautiful sunset? or how about when you think of one thing, say... oranges, then you wander off into: "mhh, oranges are nice, like apples, which are also fruit, which are sweet.. mhh, sweet, like that babe i saw the other day, who had blue eyes, man, the sky is blue too... birds fly in the sky, i want to fly in the sky, mhh, skyy vodka.. etc... etc..."


according to me, and my own little bubble, the human brain has passed the calculate-everything-to-perfection era, and now it focuses on the more interesting things like those mentioned above. but that's just me ;)

i don't think we'll see a machine capable of doing such things...


how about the brain on drugs? eh?

Vushvush
06-22-2004, 03:42 PM
guys, once again you're arguing the wrong point here. The human brain CANNOT do "what's the square root of 893475.72947? Now compute these ten billion other instructions, so we can move these 5000 triangles around in world space at 60 frames per second" because we lack the appropriate floating-point precision.

The human brain is simply allowed to make mistakes, and in fact avoids precision. You are allowed to knock the glass of water onto the ground in a clumsy effort to reach for the orange juice. The human brain is simply an analog, imprecise computer. It can't move 5000 triangles along the <<1.2,5.342,1.924>> vector, but it can damn well estimate in no time where the 2794857985948 points that comprise your mouse are and have you reach for it. Mind you, you won't perfectly nestle your hand onto it on the first try, but rather roughly slide onto it as you accidentally push it around a bit.

It's a fascinating discussion on one level, but completely irrelevant on another. Unless you're willing to have the 5000 triangles get moved with a rand function attached to every command, the computers we have today will never be able to match our so-called "intuition."

gr8spangle
06-22-2004, 05:26 PM
The difference comes down to:

A computer can recognize 475 nm as the wavelength of "blue". But does the computer "see" blue?

creative destructions
06-22-2004, 06:58 PM
guys, once again you're arguing the wrong point here. The human brain CANNOT do "what's the square root of 893475.72947? Now compute these ten billion other instructions, so we can move these 5000 triangles around in world space at 60 frames per second" because we lack the appropriate floating-point precision.

The human brain is simply allowed to make mistakes, and in fact avoids precision. You are allowed to knock the glass of water onto the ground in a clumsy effort to reach for the orange juice. The human brain is simply an analog, imprecise computer. It can't move 5000 triangles along the <<1.2,5.342,1.924>> vector, but it can damn well estimate in no time where the 2794857985948 points that comprise your mouse are and have you reach for it. Mind you, you won't perfectly nestle your hand onto it on the first try, but rather roughly slide onto it as you accidentally push it around a bit.

It's a fascinating discussion on one level, but completely irrelevant on another. Unless you're willing to have the 5000 triangles get moved with a rand function attached to every command, the computers we have today will never be able to match our so-called "intuition."
The human brain CANNOT do "what's the square root of 893475.72947?"
945.238451

Would you be satisfied if a computer gave you the letter "b" as the answer to that question? Without precision, there is nothing. Everything we've built is built to specification. I don't believe I'm arguing the wrong point here.

I do believe what you call intuition is the hidden key to organic computing. Imagine the cells that make up your body. They know how to divide, and when to cease dividing. There is a great amount of information stored and processed there, in and of itself.

UrbanFuturistic
06-22-2004, 07:57 PM
Really, people, I think a lot of you vastly underestimate the power of the human brain. As a vastly complex, highly parallel, analogue computer, it is capable of some of the most astonishing feats of mathematics.

Consider the amount of mathematical computation required just to estimate the path of a ball as it flies on a continuously changing trajectory through the air. The way a computer calculates the trajectory, it needs to know the current position, the previous position, and the value of gravity, etc.; that the brain can extrapolate from just the arc of the object's flight demonstrates the difference quite effectively.
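
The machine's version of that trick looks something like this in Python (all numbers are illustrative):

g = -9.81              # gravity, m/s^2
dt = 0.1               # time between the two observed positions, s
x0, y0 = 0.0, 2.0      # position one frame ago
x1, y1 = 0.5, 2.4      # current position

vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # estimated velocity

t = 0.5                # predict half a second ahead
print(x1 + vx * t, y1 + vy * t + 0.5 * g * t * t)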

Just to clear up a few issues:

If you consider that any human can look up a function in a book, while a computer has to load specific software every time it wants to perform a function, the human often comes out ahead, since they don't have to look up the information every time.

When it is stated that the brain uses 10% of its capacity, that's 10% of its capacity at any one time, on average. I just feel that needs to be clarified.

The brain actually is split up into different compartments; the visual cortex is at the back, while the frontal lobes handle intellect, emotion, and personality processing. The hippocampus is what could be considered the equivalent of RAM, as it holds short-term memory, which is passed on to the long-term memory centres of the brain every night, when time is spent making connections between new experiences and those of the past. This is what results in dreaming.

So yes, the brain is compartmentalised; when neuroscientists talk of grey matter and white matter, they mean the neurological networks that make up the functioning part of the brain and the nerve bundles that connect it all together and act as trunk lines between parts of the brain.

There is a specific part of the brain dedicated to facial recognition and differentiation, which can identify an individual far more accurately than any computer system in the world.

More to the point, how's that real-time voice recognition and translation computer doing? I know there's a project somewhere but I think it's a fair way off achieving its goal.

Also, since when do computers not make mistakes? Sure they make mistakes, have errors, throw a wobbler from time to time. I mean, c'mon, there's a screensaver dedicated to BSODs, kernel panics, and Guru Meditation errors. Besides, since when do computers have to deal with the massive number of parallel computations we do? Get one of those Honda walking robots (impressive as they are) and have it walk on an uneven surface. Not in the demonstration? Probably because, as much computational power as it takes to make a robot walk, climb stairs, and adjust its balance on a flat surface, it's going to take a lot more to have it constantly adapt the way we do.

Pah, I don't know, puny computers :p

regards, Paul

colintheys
06-22-2004, 08:05 PM
Unless you're willing to have the 5000 triangles get moved with a rand function attached to every command, the computers we have today will never be able to match our so-called "intuition."

While this is a very interesting argument and true to a degree, I should point out that "intuition" is anything but random. The brain is absolutely capable of doing incredibly complicated math very precisely (and does so quite often). For instance, if you are shown a rotating object, you understand its shape. Now, try to model it. Try to model a face. You KNOW the surface and shape of it precisely, and NURBS, SDS, all of the tools we have for turning our knowledge into reality, are not precise enough to satisfy a simple glance from our amazingly complex and precise brain!
Transmissions between neurons are passed via pulses and encoded by rate, not intensity. Transmission speed is constant, so there is basically no error there. The imperfections you describe, such as bumping the orange juice, result from attempts by the brain to simplify the problem. Rather than brute-force compute the actual solution, we have very sophisticated heuristics to estimate solutions. Occasionally, they're not accurate enough! However, odds are that you would NEVER make an error grabbing that orange juice if you were not also reading the paper, and thus trying to cut back on the processing time devoted to the simple motion.
However, you're right that computers really are nowhere near the brain, and estimates of how the speeds compare are truly academic. While the brain is basically a big, fast, well-programmed computer, it makes a PC look like a single transistor. We understand some of the operations of the brain well, but only VERY VERY basic ones. The comparison is like asking how many of the original mechanical calculating machines would be required to equal a brand new Boxx. Technically, we could match the calculation speed, but they certainly wouldn't exhibit identical behavior.

LightFreeze
06-23-2004, 01:28 AM
Hi guys, got to disagree with using physical ability as an example of brain power. Ask any toddler to catch a ball and you soon realise how much practice our brain needs to start getting things like that right. There's also the comparison with the physical abilities of things like insects: how much computation does it take a bee to land on a flower on a windy day, and how much practice do they get before they're kicked out of the hive to go look for honey?

How big is a bee brain, anyway?

My only theories for answering the question are that you can't compare number crunching or retentiveness to intelligence; they don't seem to be related. Unfortunately, I'm full of "not this" and "not that" but don't have an answer for how to measure brain speed or power, although speed will be finite; power, on the other hand...

creative destructions
06-23-2004, 01:54 AM
Well, in the end, people live and die, but computers can go on practically forever. So, Computers 1, Brain 0. Let's not go into the morbidity of life and death by discussing this any further. Computers will always have their purpose. They're handy and useful. They create jobs. If somebody needs an ego boost, this is not the way to get it.

Vushvush
06-23-2004, 03:48 AM
LightFreeze has essentially nailed this on the head, although in a slightly unclear way.

You guys have been doing CG for way too long, and have dealt with computers so much that you seem to only think in those terms. Your brain isn't doing any calculations to determine the trajectory of the ball, and it doesn't understand the world around it in pixels or triangles. Trying to suggest that our brain does incredible mathematical computations to understand the face of a person we know is simply not true.

It is inaccurate to suggest that we see a 432849238468 x 43274732984 pixel image and that our supercomputer of a brain processes that image pixel by pixel at whatever refresh rate. I'd actually argue that our brain is not a computer at all! I'd argue that we can't do true computations, which is why we use computers, calculators, pen and paper, and to an extent our fingers to count! People who are good at mathematical operations and memory have a brain that can fit that logic into a pattern it can work with. Essentially, what I'm trying to say is that whenever we do a mathematical operation, we are actually emulating a computer in a rough way, and thus are not very good at it. To take that further, a computer is not built like our brains, which are analog devices with specific types of sensors. This is why, whenever we ask them to emulate us, they fail.

Basically.... computer != human brain :)

colintheys
06-23-2004, 04:42 AM
Trying to suggest that our brain does incredible mathematical computations to understand the face of a person we know is simply not true.

Not to be too argumentative, as this is an interesting and fun (if a little OT) discussion, but that's a dangerous statement to make unless you know what you're talking about. :) There are researchers who devote their lives to this question. I'm a potential neuroscience major and have taken two years' worth of cognitive science, neural psychology, and behavioral neuroscience at Yale and Wesleyan. I assure you, the brain is not just like a computer; the brain IS a computer! Well, at least, that's my take. Admittedly, the issue is still debated. According to my way of looking at it, a PC, which is a serial, mechanical and electronic implementation of a computer, happens to compare very poorly to the sophisticated instrument that is a brain, but nonetheless they both fall into the category of computer: a device that manipulates symbols.

Don't take my word for it, though. I encourage you to go grab a book or paper on cognitive science / neuroscience. It's fascinating stuff and will really change the way you view the world. Of course, I admit that I'm biased. :) Here's a link to a pretty basic and unbiased overview of the question I found on Google: http://mcdb.colorado.edu/courses/3650/computer/ :) The information is a few years old at this point, but should give a good, quickie introduction to the question.

creative destructions
06-23-2004, 06:08 AM
I have to agree, since the founding of computer science was modeled after the human brain. The human brain came first, then computers. Hexadecimal, ASCII characters, and computer languages were introduced so computers could be more readily readable by humans. When you look at it, we are either on or off, seldom indecisive about anything.

Quizboy
06-23-2004, 09:32 AM
i think Vushvush and LightFreeze are headed in the right direction by saying that the brain's calculation speed should not be measured in very computer-oriented terms like pixels/second and the like. Although I disagree with them that the brain is not a computer. It is a computer; it's just that in attempting to model computers after our own brains, we have limited ourselves with our primitive way of trying to make everything linear and logical. It's natural to do, because there is no other way of quantifying a computer's functioning, but at the same time it's our failing in trying to get a computer to emulate the brain.

An example: we think of pictures in pixels/cm in order to manipulate pictures on screen. But the digital versus analog photo challenge is proof that this method is apples and oranges. Since 35mm cameras use a chemical process to record an image, the size of the image doesn't matter as a determining factor of processing speed; it is simply an instantaneous chemical impression. We try to quantify this in pixels because it is handier for us, but as we see, it is not a comparable process, because the contrast ratios of digital photography have still not matched those of analog, and you don't need an expensive SLR to get the same quality of resolution as a Hasselblad.

I believe the digital/analog challenge in photography is analogous to the digital/analog challenge in computer vs. brain. Our way of having computers calculate is handy because it is quantifiable, but at the same time it is the very reason why it is nearly impossible to match the brain's functioning. We're just going about it the wrong way. Seeing it wrong.

A computer tells you that an object doesn't fit into a box by measuring its width and comparing that to the width of the box. Box width is less than object width: object doesn't fit. Brain sees object, brain sees box, brain sees that object doesn't fit. It's more of an impression than a rigid calculation. The computer could see both objects, and without numbers for their measurements it couldn't do anything. You'd first have to hook up some kind of photogrammetry method to give the computer numbers, and then it would still use the numbers, not an impression of whether it simply SEEMS bigger or not, as the brain does. Furthermore, it makes no difference whether the object and box are 10 cm wide or 89.4566 m; it still takes the same "processing power" of seeing the two objects side by side to determine if one fits into the other. With a computer, the first calculation is faster because the number is fewer bits. This is proof that the two systems are not comparable in terms of processing speed.

Now expand this to ten objects side by side and a long box: the brain still goes for visualizing the ten objects beside each other and tries to SEE if they are wider than the long box. Right away we see the limitation of the brain, because if we throw it the question of a million objects in a super big box, the brain won't be able to handle the question by seeing; the example is so far from any reality the brain has experienced that it cannot visualize it and won't be able to answer. That's when we switch to calculating the numbers through mathematics, like a computer. And even then the answer doesn't mean as much. It just remains theoretical, whereas you can imagine the emotion behind seeing the ten objects... "No, that won't fit!"
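
Here's the computer's half of that in code (the sizes are arbitrary examples); note there is nothing in it that corresponds to the brain's "impression":

object_width = 12.7   # cm, known to the machine only as a number
box_width = 10.0      # cm

print("fits" if object_width <= box_width else "doesn't fit")

# The brain's version has no explicit widths at all: it places the two
# images side by side and reads off the impression. There is nothing
# here to measure in bits, which is the point.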

The breakthrough will come when some genius figures out how to properly conceptualize the analog calculation technique of the brain, and a second genius translates that concept into practical form.

LightFreeze
06-23-2004, 10:08 AM
my theory (and I'm not an expert) is that a neural net is just a reactive process, not a computational one, because that is how most animals seem to deal with the world.
We all (animals I mean, not just people) have pretty similar brains, and this suggests that most of our decisions are just reactions to stimuli, although because we have greater abilities, the stimuli we react to are much more complicated.

Vushvush
06-23-2004, 03:35 PM
That is exactly my point. I don't think the brain is built from anything even close to resembling lines of code, or ANY mathematical computation.

When I say computer != brain, I say it because when we talk about computers as we know them, we are talking about glorified calculators. The human brain doesn't use IF statements, and, as I said, nowhere in the brain is there a small section that does 1 + 1 = 2, which is clear from the way we teach kids that 2 apples + 2 apples = 4 apples.

Again, LightFreeze is correct in that we are NOT computational devices.... and thus not computers in the common sense.

colintheys
06-23-2004, 06:07 PM
my theory (and I'm not an expert) is that a neural net is just a reactive process
Though I don't agree ;), that is the position taken by John Searle; it is a well-respected viewpoint and, honestly, it makes good logical sense. The argument is:

" The argument proceeds by the following thought experiment. Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a data base) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output). The program enables the person in the room to pass the Turing Test for understanding Chinese but he does not understand a word of Chinese."
"The point of the argument is this: if the man in the room does not understand Chinese on the basis of implementing the appropriate program for understanding Chinese then neither does any other digital computer solely on that basis because no computer, qua computer, has anything the man does not have."

However, unless you accept some form of dualism, this leaves us, sadly, as nothing but a complex series of automatic reactions and reflexes, the kind of account a behaviorist would use to describe behavior. Put input A into the brain, receive output B. If you can control the stimuli perfectly, you will always get the same result, according to them.


Brain sees object, brain sees box, brain sees that object doesn't fit.
That is not a solution to a problem... That's like saying "calculator sees 3, calculator sees 4, calculator sees plus, so calculator sees 7! See, no computation!" Realize that something MUST happen behind the black-box description. In reality, we perform a very detailed and precise analysis of the information we have available to determine if the object fits in the box. For instance, just the process of turning the input from the eyes into representations of "object" and "box" is highly mathematical and highly complex. We use some generalizations and rules to aid us (for instance, we assume that light is generally uniform and comes from above).

It is inaccurate to suggest that we see a 432849238468 x 43274732984 pixel image
Also, realize that the eye is an imaging device that communicates via neurons and thus definitely has a "resolution" and a "frame rate." It so happens that it is one of the best-understood parts of the nervous system. We can literally intercept the optic nerve and read the signal. There are a finite number of axons, each of which carries the signal from a fixed number of cones and rods at a fixed rate, thus resulting in a fixed number of sample points transferred at a certain rate. This results in an image (not a very good one, btw). However, what makes the eye so much more advanced than your digital camera is its additional processing. For instance, the receptors in each visual field of the eye detect not only individual points; they are also processed in small groups to search for sudden changes in intensity. This is done by testing series of concentric circles of sensitivity and comparing their brightness. That data is then analyzed to help detect edges and their orientation! Another thing that makes eyes different from what we think of as digital images is that the "dpi" is not constant. The center of the eye is FAR more detailed than the rest. The very center, known as the fovea centralis, is home to the most receptors and thus carries the most resolution. That is why you look directly at something to see it best.
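The concentric-circle comparison described above is usually modelled as a center-surround receptive field. Here is a minimal C sketch (the image, its size, and the weighting are my own toy illustration, not retinal physiology): the centre sample is compared against the average of the ring around it, and a large response flags a sudden intensity change, i.e. a likely edge.

#include <stdio.h>

#define W 8
#define H 8

/* Center-surround response at (x, y): the centre pixel compared with
   the average of its ring of 8 neighbours. A large magnitude means a
   sudden change in intensity, i.e. a likely edge. */
static double center_surround(const double img[H][W], int x, int y)
{
    double surround = 0.0;
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++)
            if (dx != 0 || dy != 0)
                surround += img[y + dy][x + dx];
    return img[y][x] - surround / 8.0;
}

int main(void)
{
    double img[H][W] = {{0}};
    for (int y = 0; y < H; y++)       /* bright right half: a vertical edge */
        for (int x = W / 2; x < W; x++)
            img[y][x] = 1.0;
    printf("flat region: %+.2f\n", center_surround(img, 1, 3));
    printf("at the edge: %+.2f\n", center_surround(img, W / 2, 3));
    return 0;
}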




BTW, however did we get SO off-topic! :)

visualride
06-23-2004, 06:42 PM
Then genetics come into play. It has been shown in many studies that the aptitude of a person to learn is genetic. Newer generations will become enormously more capable than their predecessors as time wears on.
Whether or not the aptitude of a person to learn is genetic is debatable. The brain is a bit like a muscle. Consistently flexing the part of the brain used to play a piano will create new connections in the brain to deal with that kind of processing. Of course, many people have very strong connections already formed in their brains that conflict with these new connections being made, but those can be broken down too with mental reprogramming (hah! now I'm really rambling).
Now if only our computers would start rendering faster after a while, having "learned" how to do it. Maybe someday the computer will be rendering out a Cornell box while walking up the stairs and contemplating what to make for dinner. :)

athosghost
06-23-2004, 07:21 PM
Great topic. I'm more inclined to believe that our brains handle nothing more than 1's and 0's as a computer would, although on a much more advanced level; after all, the brain is a collection of synapses that are either on or off. How many synapses do we as humans have in our brains? Something on the order of 100 trillion, across tens of billions of neurons. And how many transistors are in your top-of-the-line CPU? 150 million? Sure, we can't do complex computations on the fly, but I'm sure our brains could be trained to do so. I recall seeing a "human calculator" on the Discovery show "More than Human" who claimed that any human can be trained (programmed) to do the same.

athosghost
06-23-2004, 07:28 PM
That is exactly my point. I don't think the brain is built from anything even close to resembling lines of code, or ANY mathematical computation.

When I say computer != brain, I say it because when we talk about computers as we know them we are talking about glorified calculators. The human brain doesn't use IF statements, and as I said, nowhere in the brain is there a small section that does 1 + 1 = 2, which is clear from the way we teach kids that 2 apples + 2 apples = 4 apples.

Again, Lightfreeze is correct in that we are NOT computational devices... and thus not computers in the common sense.
I heard a report on NPR some time ago about how DNA is actually very much like spaghetti code. A really, really badly written one at that. I have to disagree, though, that we don't use IF statements:

if (beerTemp == WARM) /* '==' compares; a single '=' would assign */
{
    yuck();
}
else
{
    yum();
}

We probably don't have to run that line of code every time we are offered a beer, because we natively remember that we don't like warm beer. It's like second nature.

creative destructions
06-23-2004, 07:43 PM
I heard a report on NPR some time ago about how DNA is actually very much like spaghetti code. A really, really badly written one at that. I have to disagree, though, that we don't use IF statements:

if (beerTemp == WARM) /* '==' compares; a single '=' would assign */
{
    yuck();
}
else
{
    yum();
}

We probably don't have to run that line of code every time we are offered a beer, because we natively remember that we don't like warm beer. It's like second nature.
C and C++ were made to closely resemble the English language, so it shouldn't be surprising that they mimic our thinking patterns. In assembler the same decision would look like this:


cmp d0,d1  ; compare beer temperature (d0) with the "warm" threshold (d1)
bgt YUCK   ; warmer than the threshold: branch to yuck
jmp YUM    ; otherwise: yum


Layers of abstraction make it easier for us to understand computers and mathematics. I think the real question is, "Are we really binary computers, or does quantum computing come into play?"

SOPLAND
06-23-2004, 08:25 PM
Layers of abstraction make it easier for us to understand computers and mathematics. I think the real question is, "Are we really binary computers, or does quantum computing come into play?"
We are definitely binary computers. Our mind is made up of a series of gates or switches that are configured in such a way as to fire synapses in a particular sequence. The difference between the human mind and a computer comes down to how data is handled and how quickly it's handled. Our minds are massively parallel and can handle data out of sequence, discarding data and information that's irrelevant. A computer, on the other hand, uses a pipeline. It is much easier to understand and handle data on a conscious level if there is a pipeline and the data is handled in sequence. That's why we have designed computers the way they are: so we can understand them easily. We don't have enough of a grasp of the human mind to easily understand it on a conscious level, much less design a computer that operates similarly. Eventually we'll probably design a computer that emulates the way the human mind works pretty well, but it's going to have to be an order of magnitude more powerful than the Earth Simulator to deal with all the overhead.
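As a loose illustration of the pipeline-versus-massively-parallel distinction, here is a hedged C sketch: the same independent work done strictly in sequence, then fanned out across cores with an OpenMP pragma (which a compiler without OpenMP support simply ignores, so the code runs either way). The array and the doubling step are invented for illustration.

#include <stdio.h>

#define N 1000000

static double in[N], out[N];

int main(void)
{
    for (int i = 0; i < N; i++)
        in[i] = i * 0.001;

    /* Pipeline style: one element after another, in strict order. */
    for (int i = 0; i < N; i++)
        out[i] = in[i] * 2.0;

    /* "Massively parallel" style: the iterations are independent,
       so they can be scattered across many workers at once. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        out[i] = in[i] * 2.0;

    printf("%f\n", out[N - 1]);
    return 0;
}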

Vushvush
06-23-2004, 10:49 PM
ColinTheys, I understand that we need to disagree in order to have a fun discussion, but don't treat me as if I don't know how the eye works :)

When I say we don't see pixels, I'm not talking about the physical makeup of an image in our brain, but rather about the fact that we do not go and evaluate pixel 1,1, then 1,2, ......... 473298432,73210984 and so on. This is how we differ from a computer, in seeing and in all other aspects of being. This is also the reason you'll never get the exact same reaction from a person twice. The randomness generated by the way we process information sees to that.

LightFreeze
06-24-2004, 12:52 AM
Colintheys, is there such a thing as a neural net of neural nets?

I'm in the neural net camp and feel people only compare ourselves to computers or binary systems because there is nothing else we have developed that comes close to a brain's abilities. But just because computers are the first things we have developed with abilities similar to our own doesn't mean they operate in a similar way to us.

Computers are now able to simulate lots of different processes, like weather systems or racing-car engines, but that doesn't mean everything works the way a computer works.

Vushvush
06-24-2004, 01:47 AM
Lightfreeze, what you said is very true when referring to the computer's action as a simulation. Even if and when we develop a decent type of AI, it will still be merely a simulation. The computer as we know it at the moment (1s and 0s) will never be "just like a human brain" any more than a computer will be a tornado (so to speak).

colintheys
06-24-2004, 04:51 AM
ColinTheys, I understand that we need to disagree in order to have a fun discussion, but don't treat me as if I don't know how the eye works :)

When I say we don't see pixels, I'm not talking about the physical makeup of an image in our brain, but rather about the fact that we do not go and evaluate pixel 1,1, then 1,2


Ah! Sorry for that, I misunderstood you. It was not intended to be condescending. :) Sorry if it came across as such. Here, I agree with you! That is the fundamental shortcoming of our current computers and image-analysis techniques. The brain takes the same input and processes it in a very different, very parallel way. The only place I would argue is that I don't consider the differing element to be random, but rather a consequence of differences between individuals and differences in their stored memories and developed habits.


Colintheys, is there such a thing as a neural net of neural nets?

I'm in the neural net camp and feel people only compare ourselves to computers or binary systems because there is nothing else we have developed that comes close to a brain's abilities. But just because computers are the first things we have developed with abilities similar to our own doesn't mean they operate in a similar way to us.
I have always found neural nets fascinating, and I imagine that there can certainly be a net of neural nets, and that it would approximate the operation of the brain far more closely than current computers do! This is a very good point, and I admit you're probably right. However, I was under the impression that a neural network is indeed a form of computer. When I say the brain is a computer, I don't mean that it is a serial, cyclic, calculating device like a PC. I mean that it is a device that operates by manipulating symbols (of which numbers are a subset). However, if you're in the neural net camp, you probably know more about them than I do, so please correct me if I'm wrong!

I don't pretend that our current implementation of the idea of a "computer" is anywhere near as sophisticated as the implementation that is the brain. It probably compares no better than the first tribe of hunters compares to the United States. While both qualify as a "society," one is VERY different from the other.
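Since the "net of neural nets" question came up, here is a minimal C sketch of the idea under my own invented structure (the sizes, the names, and the zeroed weights are placeholders; real weights would come from training): several small feedforward nets each digest the raw input, and their outputs become the inputs of a top-level net.

#include <math.h>
#include <stdio.h>

#define IN  3
#define HID 4

/* One small feedforward net: IN inputs -> HID hidden units -> 1 output. */
typedef struct {
    double w1[HID][IN], b1[HID]; /* input-to-hidden weights and biases */
    double w2[HID], b2;          /* hidden-to-output weights and bias  */
} Net;

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

static double forward(const Net *n, const double in[IN])
{
    double h[HID], out = n->b2;
    for (int j = 0; j < HID; j++) {
        double s = n->b1[j];
        for (int i = 0; i < IN; i++)
            s += n->w1[j][i] * in[i];
        h[j] = sigmoid(s);
    }
    for (int j = 0; j < HID; j++)
        out += n->w2[j] * h[j];
    return sigmoid(out);
}

int main(void)
{
    /* Three sub-nets feed one integrating net: a net of nets. */
    static Net vision, hearing, touch, integrator; /* zero weights: placeholder */
    double scene[IN] = { 0.2, 0.7, 0.1 };
    double senses[IN] = {
        forward(&vision,  scene),
        forward(&hearing, scene),
        forward(&touch,   scene),
    };
    printf("integrated response: %f\n", forward(&integrator, senses));
    return 0;
}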

fasteez
07-03-2004, 03:36 PM
While learning Ada at university I got bored, so I talked about this with a friend...
I tried to do two things at once, but it's pretty much impossible to speak to him while fluently writing something else (he told me the sentences just beforehand so I could memorize them, to make it easier too)...
It was a really sequential process, even though I had thought we were able to do multiple things at one time.
I spoke some words, took a breath, and wrote while breathing, because breathing doesn't need concentration...

In fact we can do animal things in a parallel way: walking, breathing, etc., because they are handled by a special part of the brain (the same one as our primate ancestors, as I read in a book). That part manages the life-critical, animal side of us (reminds me of an OS kernel ^^)... but the outer part of the brain, which makes us "intelligent" (lol, not every time), manages more complicated things that are difficult to parallelize; you have to learn them first (like playing piano or drums, which require a lot of coordination, you know).

Hmm, to finish, I think that speed is not what matters; the real strength of the brain is that it is living... we can organize, learn, improve... a computer is only static (for the moment).

edit: on the subject of neural nets, our brain creates new neural connections all the time; that is the learning process, I think (only my supposition). At present no machine or piece of electronics is able to modify itself or add new parts to itself (no growth).

allseeingi
07-04-2004, 04:12 AM
This is probably the most interesting thread I've seen on this board since I've been here!

Anyway, I don't know what the importance of this fact is: computers are entirely logical. A particular input will always generate exactly the same output, because we have built computers' internal communication systems to be as predictable as possible (e.g. under normal circumstances a transistor will rarely perform the wrong operation). But humans rely on chemical reactions throughout their communication systems, and chemical reactions, as we know, are never precisely predictable. Not only that, but the chemical reactions in one communication system (e.g. the nervous system) affect the operation of other communication systems (e.g. the endocrine system), and vice versa, in entirely unpredictable ways. Surely this fundamental difference in the communication of information goes some way toward explaining why there is such a difference in the way the brain operates as opposed to a computer?

- allseeingi

jcbray
07-04-2004, 05:54 AM
I don't think it's right to say that computers act in a logical manner and brains don't. We do too; we just weigh up lots of different things.

Take whatever example you want. Say someone's angry at you: you use logic to try to figure out why. Have you done something? What did you do? Why did it affect them so much? What was their mood at the time? How angry are they? Were they serious or joking? You process hundreds of different factors and come up with the final answer (he didn't like getting punched ;)).

Humans are also quite predictable; psychology wouldn't be much use if we weren't, would it? It's often more an inability to understand what all the factors are, and the level of importance each person places on them...

parallax
07-04-2004, 01:37 PM
It is inaccurate to suggest that we see a 432849238468 x 43274732984 pixel image and that our supercomputer of a brain processes that image pixel by pixel at whatever refresh rate. I'd actually argue that our brain is not a computer at all!

That's where you are wrong.
The fact that artificial eyes are working real-world devices proves that the brain IS quite the computational juggernaut. If we are able to capture footage through electronic devices, process the data, and connect it to the visual cortex, it means the brain is comparable to a logic device.

It is a fact, because it works. Just like the H-bomb proves Einstein right.

But to get back to the discussion: the human brain simply doesn't have the inputs and routines to process exact discrete data, simply because it doesn't need them. It's a balanced analog system which needs to "process" its actions this way, because a fully discrete system will not work in an analog world.

colintheys
07-04-2004, 03:01 PM
The statement has been made several times that the brain is an analog device. I would like to point out that it is not. Analog systems produce smooth, "continuously variable" output, while digital systems represent data with "discrete symbols from a finite set" (often binary).

It is incorrect to suggest that the brain is analog. Each neuron has a long appendage called an axon and receivers called dendrites. Axons connect to the dendrites of other neurons, and neurons communicate by sending pulses down their axons to the dendrites of other neurons. However, it is not the intensity, duration, etc. of these pulses that carries the data, but rather their mere presence or absence. A neuron is triggered to fire when the rate of pulses hitting it passes a certain level.

If the brain were an analog device, it would have to monitor the incoming signals for continuously variable intensity, making it much more susceptible to noise. A neuron does not do this. It reads its input as either on or off. Pulse or no pulse; not "some pulse." A weak incoming signal at 20% of normal intensity is simply ignored as noise. This is a digital system. It's not a packetized digital system like we often deal with in computers, and it's not a digital system with a fixed clock cycle. It's a digital system where data is transmitted via pulse rate, sort of like Morse code (only not really).
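A minimal C sketch of that all-or-nothing behaviour, loosely in the spirit of a leaky integrate-and-fire model (the constants are invented for illustration, not physiology): incoming pulses raise a membrane potential that leaks away over time, and the output is a bare fire/no-fire event once a threshold is crossed.

#include <stdio.h>

#define STEPS 50

int main(void)
{
    const double leak = 0.9;      /* fraction of potential kept each step */
    const double threshold = 1.0; /* firing threshold                     */
    const double pulse = 0.3;     /* contribution of one incoming pulse   */
    double potential = 0.0;

    for (int t = 0; t < STEPS; t++) {
        int incoming = (t % 3 == 0);  /* pulse rate: one pulse every 3 steps */
        potential = potential * leak + (incoming ? pulse : 0.0);
        if (potential >= threshold) {
            printf("t=%2d: fire!\n", t); /* all-or-nothing: no amplitude */
            potential = 0.0;             /* reset after firing           */
        }
        /* A sub-threshold trickle (say 20% intensity) would simply
           leak away: the neuron never emits a "weak" pulse. */
    }
    return 0;
}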

Vushvush
07-04-2004, 08:55 PM
That's where you are wrong.
The fact that artificial eyes are working real-world devices proves that the brain IS quite the computational juggernaut. If we are able to capture footage through electronic devices, process the data, and connect it to the visual cortex, it means the brain is comparable to a logic device.

The computation is still done in the artificial device. And if it's just a method of sending the image to the brain, well, you have simply replaced the eye's structure, not the way we perceive. Either way, this doesn't reflect in any way on the discussion of how we compute imagery internally...

CGTalk Moderation
01-14-2006, 12:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.