PDA

View Full Version : Update on (SONY, IBM) Cell Processor


RobertoOrtiz
02-08-2005, 02:26 PM
Quote:
"Setting up a battle for the future of computing, engineers from IBM, Sony and Toshiba unveiled details Monday of a microprocessor they claim has the muscle of a supercomputer and can power everything from video game consoles to business "

>>Link<< (http://story.news.yahoo.com/news?tmpl=story&ncid=738&e=1&u=/ap/20050208/ap_on_hi_te/cell_processor)

-R

dmonk
02-08-2005, 03:05 PM
I just hope the price tag on these is reasonable when they hit the market. Tons of possibilities. If it lives up to all of the hype, I think it will be a good shake-up for the industry.

Coliba
02-08-2005, 03:32 PM
" And IBM Corp. has said it will sell a workstation with the chip starting later this year."

Yum, yum......

eks
02-08-2005, 09:21 PM
Cell Architecture Explained (http://www.blachford.info/computer/Cells/Cell0.html)

Thalaxis
02-09-2005, 02:13 PM
I just hope the price tag on these is reasonable when they hit the market. Tons of possibilities. If it lives up to all of the hype, I think it will be a good shake-up for the industry.

I wouldn't worry on that front, at least for the console; Sony has been willing to accept the console
itself as a loss leader in the past, in order to maintain a price point that console buyers will tolerate.
They don't have a choice... price it reasonably, or kiss their $400 million in R&D + whatever part of
the fab bill they're footing + whatever they paid nVidia for their contribution goodbye, and completely
kill off one of their biggest cash cows... you figure out which one they'll pick :)

In the end, most of the people who buy it won't care how powerful it is, they will only be concerned
with what cool new games it will let them play, and whether or not they can afford the console and
some games to enjoy it with.

AirbORn
02-10-2005, 04:29 AM
Looks like Moore's law is still at full life support!! :) So when are 3D chips going to be released... lol.... Imagine today's processing power to the power of 10. If anyone isn't familiar with Moore's law you should read up on it. Based on this law, which dates from the 1960s, by the year 2020 the processing speed of chips will exceed the speed at which the human brain processes. Scary? Just imagine how fast chips will be by 2030..... according to Moore's Law.... a single chip will be "vastly" faster than all human brains on the planet. Think it is a bunch of bull????..... the prediction hasn't failed since the 1960s and there are only 15 years left :)

Hehe, since I am on a roll, I recommend you guys read "The Age of Spiritual Machines" :)

js33
02-10-2005, 07:52 AM
Moore's law only stated that the number of transistors on a chip would double every 18 months. He never said anything about speed, although speed does parallel the number of transistors to some extent. Also, current manufacturing processes are hitting a wall, so the only way forward is dual-core or higher multi-core chips. Of course there is the Cell, and the crossbar latch that was announced by HP lately. It will take breakthroughs like these to keep moving forward.
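A quick back-of-the-envelope sketch of what that doubling rate implies, assuming a clean 18-month doubling period and a starting point of roughly 42 million transistors for a 2000-era Pentium 4 (illustrative figures only; real products never track the curve exactly):

# Rough Moore's-law projection: transistor count doubling every 18 months.
def projected_transistors(start_count, start_year, target_year, doubling_months=18):
    months = (target_year - start_year) * 12
    return start_count * 2 ** (months / doubling_months)

# Illustrative: ~42 million transistors in 2000, projected forward to 2005.
print(int(projected_transistors(42_000_000, 2000, 2005)))  # roughly 423 million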

Cheers,
JS

playmesumch00ns
02-10-2005, 11:01 AM
Looks like Moore's law is still at full life support!! :) So when are 3D chips going to be released... lol.... Imagine today's processing power to the power of 10. If anyone isn't familiar with Moore's law you should read up on it. Based on this law, which dates from the 1960s, by the year 2020 the processing speed of chips will exceed the speed at which the human brain processes. Scary? Just imagine how fast chips will be by 2030..... according to Moore's Law.... a single chip will be "vastly" faster than all human brains on the planet. Think it is a bunch of bull????..... the prediction hasn't failed since the 1960s and there are only 15 years left :)

Hehe, since I am on a roll, I recommend you guys read "The Age of Spiritual Machines" :)

Moore's law in its literal sense is already dead: as described in more detail in another thread, transistors have got so small that quantum tunnelling "leaks" current out of the circuits. Manufacturers can't pack more transistors onto a chip without raising the power to combat leakage, but that means you need increased cooling to stop the chip from melting (not to mention what it'll do to the environment).

Hence manufacturers are moving to parallel processing (HT/MultiCore/Cell) or trying to replace the transistor (HP's crossbar latch thingumy).

Thalaxis
02-10-2005, 01:33 PM
Moore's law in its literal sense is already dead: as described in more detail in another thread, transistors have got so small that quantum tunnelling "leaks" current out of the circuits. Manufacturers can't pack more transistors onto a chip without raising the power to combat leakage, but that means you need increased cooling to stop the chip from melting (not to mention what it'll do to the environment).

Hence manufacturers are moving to parallel processing (HT/MultiCore/Cell) or trying to replace the transistor (HP's crossbar latch thingumy).

Which is only possible because of the blindingly obvious fact that Moore's Law is still very much alive.

It is in fact on track through the 45 nm process node, which puts its potential demise at a minimum of
5 years out, since no one's even using 65 nm process technology in production yet.

mummey
02-10-2005, 02:18 PM
Which is only possible because of the blindingly obvious fact that Moore's Law is still very much alive.

It is in fact on track through the 45 nm process node, which puts its potential demise at a minimum of
5 years out, since no one's even using 65 nm process technology in production yet.

No one's using the 65 nm process technology in production because they are having too many problems with it. Moore's law may still apply in theory, but it was really about the practicality of processor speed increasing, and that has already failed.

Even when the multi-core chips begin to come out, people will have to wait for the software to take advantage of them, the same way they are waiting for 64-bit.

Thalaxis
02-10-2005, 02:28 PM
No one's using the 65 nm process technology in production because they are having too many problems with it.


Did it not occur to you that they might not be using it because it's new? Nearly every process has had
problems before companies started using it in production, so that's hardly unique to the 65nm node. It's
scheduled to go into production in about a year from Intel, and about 6 months later from IBM and AMD.
Considering how recently all three of those companies put 90nm process technology into production,
that's a rather impressive turnaround.


Moore's law may still apply in theory, but it was really about the practicality of processor speed increasing, and that has already failed.


No matter how you try to play around with words and technicalities, you're still wrong. Transistor budgets
are still increasing, just like Moore predicted that they would. I doubt that he anticipated how far it
would go, though; it's hard to believe that anyone back then might have been imagining that a
processor with 1.75 billion transistors might be feasibly produced in volume in 2005.
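As a rough sanity check of that figure (a sketch only, assuming Moore's 1965 paper put the most complex chips at roughly 60 components and taking the 1.75-billion number at face value), the implied doubling interval over those 40 years still comes out under two years:

import math

# Doublings needed to go from ~60 components (1965) to 1.75 billion (2005),
# and the doubling interval that implies. Starting figure is an assumption.
start, end = 60, 1.75e9
years = 2005 - 1965
doublings = math.log2(end / start)
print(round(doublings, 1))               # about 24.8 doublings
print(round(years * 12 / doublings, 1))  # about 19.4 months per doubling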


Even when the multi-core chips begin to come out, people will have to wait for the software to take advantage of them, the same way they are waiting for 64-bit.

That's hardly a new phenomenon.

mummey
02-10-2005, 02:43 PM
No matter how you try to play around with words and technicalities, you're still wrong. Transistor budgets
are still increasing, just like Moore predicted that they would. I doubt that he anticipated how far it
would go, though; it's hard to believe that anyone back then might have been imagining that a
processor with 1.75 billion transistors might be feasibly produced in volume in 2005.

Oh, cut it out with the "Moore's Law" fanboy crap you're spewing. Moore wrote a research paper in 1965 and made an estimate for ten years after that point. Anything that occurred after that point was pure luck, because nothing in that paper describes the methods they would have to use to keep increasing the transistor count exponentially for so long.

Moore's Law is a coincidence. Quit making it sound almost 'divine'.


That's hardly a new phenomenon.

Maybe not to you or me in academia, but most people out there may be in for a little surprise when they buy these new systems with multi-core CPUs and find that most of their apps will not run any faster and may even run slightly slower.

Thalaxis
02-10-2005, 02:50 PM
Oh, cut it out with the "Moore's Law" fanboy crap you're spewing. Moore wrote a research paper in 1965 and made an estimate for ten years after that point. Anything that occurred after that point was pure luck, because nothing in that paper describes the methods they would have to use to keep increasing the transistor count exponentially for so long.


Fanboy crap? Stop putting words in my mouth.


Moore's Law is a coincidence. Quit making it sound almost 'divine'.


Ah, so you just can't admit that you were wrong. I'm not trying to make it sound like anything other
than what it was, but unlike you I'm just referring to what he actually said, rather than inventing
some excuse to claim that it's dead.


Maybe not to you or me in academia, but most people out there may be in for a little surprise when they buy these new systems with multi-core CPUs and find that most of their apps will not run any faster and may even run slightly slower.

That's also hardly a new phenomenon in the computer industry. Remember the marketing behind
MMX and AltiVec? Remember the P4's introduction?

Besides, the multi-core processors won't have the same clock speed as their single-core brethren,
so the uneducated won't expect them to be as fast anyway -- most of them haven't figured out the
relationship between architecture, clock speed, and performance.

mummey
02-10-2005, 02:57 PM
I guess it doesn't matter to you what I say: "I'm wrong." Your last three posts say so. ;)

AirbORn
02-10-2005, 03:54 PM
Perhaps I should scan the part of the book I have that talks about why Moore's law isn't as dead as everyone thinks it is.

The reason I believe it isn't dead is that computing speed is constantly increasing, and, like Thalaxis stated, companies run into problems with new technology for processors. And even with these problems, the chip is still able to compute at a higher speed, but obviously isn't ready for distribution. I mean, please correct me if I am wrong, but Moore's law applies more to the computing speed during production of a chip (in a lab), not to the speed of a chip that is distributed to the market at that time. That is my 2 cents...

Thalaxis
02-10-2005, 04:22 PM
I guess it doesn't matter to you what I say: "I'm wrong." Your last three posts say so. ;)

Well, if you actually READ what Moore said, then it's clear that you were entirely wrong, because he
didn't say anything directly about performance. That his prediction lasted far longer than anyone
thought it would may have been fortuitous, but it doesn't change the fact that it's true. And so far,
even IBM's ailing semiconductor division is showing that his law (which, it's true, could be better
characterized as a rule of thumb) is still alive.

Thalaxis
02-10-2005, 04:31 PM
Perhaps I should scan the part of the book I have that talks about why Moore's law isn't as dead as everyone thinks it is.


It's related to the fact that he didn't make any predictions about clock speeds, which are starting to
come to a thermal limit.


The reason I believe it isn't dead is that computing speed is constantly increasing, and, like Thalaxis stated, companies run into problems with new technology for processors. And even with these problems, the chip is still able to compute at a higher speed, but obviously isn't ready for distribution. I mean, please correct me if I am wrong, but Moore's law applies more to the computing speed during production of a chip (in a lab), not to the speed of a chip that is distributed to the market at that time. That is my 2 cents...

What he referred to is more general than that, and doesn't even necessarily correlate to performance.
The embedded community has been enjoying the benefits of increased transistor densities for years,
because it allows them to lower production costs and shove more processors into the distribution
channels. In a lot of those markets, no one cares two bits about computing power, but they do care
about power consumption and die size.

The only reason that Moore's Law became associated with performance is that he happened to be
one of Intel's founders, Intel's main product line is x86, and one of x86's primary features has become
performance. It got associated with clock speeds because of the P4 marketing machine.

One thing I think is ironic about the Cell hype is that the host processor actually has far more in
common with the P4 than with any other processor on the market; it's a narrow-issue speed racer.
:)

Wintermute
02-10-2005, 05:02 PM
The Cell actually has more in common with the 80 MHz PPC 601 processor in my old Power Mac 7100 (in terms of how the SPEs are set up).
Ars article (part 1) (http://arstechnica.com/articles/paedia/cpu/cell-1.ars)

IBM/STI et al. must be pretty confident in moving to the 65 nm process, considering how difficult moving to 90 nm was.

:wip:

Thalaxis
02-10-2005, 09:48 PM
The Cell actually has more in common with the 80 MHz PPC 601 processor in my old Power Mac 7100 (in terms of how the SPEs are set up).
Ars article (part 1) (http://arstechnica.com/articles/paedia/cpu/cell-1.ars)

IBM/STI et al. must be pretty confident in moving to the 65 nm process, considering how difficult moving to 90 nm was.

:wip:

It's based on a research project that they did a few years ago. It's a 2-issue, in-order processor with
simultaneous multithreading. Very simple, making up for very low IPC with high clock speed. That's
for the "main" processor, not the SPEs.

Either confident or optimistic... but they might also be planning to implement AMD's automation
technology by then, which could make quite a difference. Also, IBM isn't working on the 65 nm
fab tech alone; they're working with almost everyone in the semiconductor industry that isn't Intel.

Cyberdigitus
02-11-2005, 04:30 PM
by 2030..... according to Moore's Law.... a single chip will be "vastly" faster than all human brains on the planet.

I recommend you guys read "The Age of Spiritual Machines" :)

Hmm, it's not only technology that is evolving, you know; humanity itself is evolving too. We can't even measure or define how 'fast' the 'brain' works anyway. I believe thoughts are 'processed' at 'the speed of light', and are instantly available to the whole of reality. No computer equals that... unless the computer taps into the very fabric of Nature itself, but we already do that, with our species being multidimensional.

But some day we will see such Chrystal Light Technology... to be used alongside Inner Technology. Advanced technology only becomes available in step with spiritual advancement; that's why we are seeing such technological advancement in the first place. It's a circular thing.

May sound a bit New Age and all, but alas, science is quickly discovering things about us and our 'surrounding' reality that will surprise us. I recommend reading about 'the age of Spiritual Humanity' ;)

Thalaxis
02-11-2005, 04:42 PM
Hmm, it's not only technology that is evolving, you know; humanity itself is evolving too. We can't even measure or define how 'fast' the brain works anyway. I believe thoughts are 'processed' at the speed of light,


Dude, nerves are SLOW. The power of the brain has nothing to do with that. It comes in large part from
staggeringly massive parallelism.

Another part comes from adaptability, which is to say the ability to create new connections between
nerves.

Yet another is the staggering number of connections neurons can make between one another.

How all of that stuff works together is the big unknown. We have a very good understanding of how
nerves transmit signals along their lengths and to each other, but what they do with those signals is
still the subject of considerable research. I don't think we'll get any sort of handle on that until the
computer science guys and the neuroscience guys finally figure out that each knows something
that the other doesn't and start pooling their resources. You'd be surprised at how many people
in the physical sciences think that they know how to program computers... and don't get me
started on programmers.

slaughters
02-11-2005, 06:22 PM
I don't care that "The Cell" is getting controversial reviews. I liked the special effects and think Jennifer Lopez did an OK job in it.

(Sorry :), but this has been floating around in the back of my head since I saw the first "The Cell" thread last week)

---- Edited In -------

... I believe thoughts are 'processed' at 'the speed of light'...

Thoughts are a biochemical reaction (receptors, dendrites, blah, blah, blah, etc.). Last I checked, chemical reactions are considerably slower than lightspeed :)

BUT - the density of the information transfer is considerably greater than in the simple binary system used by computers. Chemical reactions can have millions of different "states" possible in each simple transfer, vs. the 2 states available (0 or 1) in the binary representations used by a computer.
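Taking that "millions of states" figure at face value (purely illustrative; the real number of distinguishable states per chemical signalling event isn't settled), the information per event compares roughly like this:

import math

# Bits per signalling event: a binary switch vs. a hypothetical channel
# with one million distinguishable states (illustrative figure only).
binary_bits = math.log2(2)            # 1 bit per 0/1 transfer
chemical_bits = math.log2(1_000_000)  # ~19.9 bits if a million states were distinguishable
print(binary_bits, round(chemical_bits, 1))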

Apoclypse
02-12-2005, 05:21 AM
Whoa, this stuff is too deep for me, man.

mustique
02-12-2005, 12:41 PM
Davidaleon on 3dBuzz posted this link to an in-depth review of the Cell architecture.

http://www.realworldtech.com/page.cfm?ArticleID=RWT021005084318

rendermania
02-13-2005, 11:11 PM
What is this I read about the Cell delivering 256 Gflops @ 4Ghz? And what software or OS, precisely, is this IBM workstation supposed to run?

Thalaxis
02-14-2005, 03:05 PM
What is this I read about the Cell delivering 256 Gflops @ 4Ghz?


That's the theoretical peak for 32-bit floating point throughput when using the SIMD engines + the
PPC processor maximally.
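One commonly cited way to arrive at that 256 GFLOPS figure counts only the eight SPEs, each issuing a 4-wide single-precision fused multiply-add per cycle at 4 GHz (a sketch under those assumptions; sources differ on whether the PPE's own SIMD unit is folded in):

# Rough derivation of the quoted 256 GFLOPS single-precision peak.
spes = 8            # synergistic processing elements
simd_width = 4      # 32-bit lanes per SPE operation
flops_per_lane = 2  # a fused multiply-add counts as two flops
clock_ghz = 4.0

print(spes * simd_width * flops_per_lane * clock_ghz)  # 256.0 GFLOPS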


And what software or OS, precisely, is this IBM workstation supposed to run?

That they haven't discussed yet, AFAIK. Given that IBM plans to launch workstations using it, I'd
guess that they'll use Linux, but that's just a theory based on IBM's massive backing of Linux.

It may also make its way into add-on cards, which would be a clever application as well,
particularly since the existing implementation is very constrained in terms of what systems
you can put it in and how much memory you can put in the system. There will probably be
future variants of it that aren't quite so tightly bound to the gaming console's needs, but
they haven't discussed that sort of thing publicly yet, either.

rendermania
02-14-2005, 04:11 PM
A Cell processor based hardware rendering card would be an interesting idea for sure.

CGTalk Moderation
02-14-2006, 05:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.