View Full Version : The End of GPU and DirectX, Interview w. Tim Sweeney


Solothores
09-15-2008, 09:49 AM
"Inflection point" is a much abused word these days, but if it's appropriate anywhere, then it's appropriate for describing the moment in the history of computing that we're rapidly approaching. It's a moment in which the shift to many-core hardware and multithreaded programming has quite literally broken previous paradigms for understanding the relationship between hardware and software, and the industry hasn't yet sorted out which new paradigms will replace the old ones.

Importantly, the entire computing industry won't pass through this inflection point all at once; it will happen at different times in different markets, as Moore's Law increases core and thread counts for different classes of processors. The first type of device to pass this inflection point will be the GPU, as it goes from being a relatively specialized, function-specific coprocessor to a much more generally programmable, data-parallel device. When the GPU has fully made that shift, game developers will have the opportunity to rethink real-time 3D rendering from the ground up.

For Tim Sweeney, co-founder of Epic Games and the brains behind every iteration of the widely licensed Unreal series of 3D game engines, this inflection point has been a long time coming. Back when Unreal 1 was still in stores and the 3Dfx Voodoo owned the nascent discrete 3D graphics market, Sweeney was giving interviews in which he predicted that rendering would eventually return to the CPU. Take a 1999 interview with Gamespy (http://archive.gamespy.com/legacy/interviews/sweeney.shtm), for instance, in which he lays out the future timeline for the development of 3D game rendering that has turned out to be remarkably prescient in hindsight:

2006-7: CPU's become so fast and powerful that 3D hardware will be only marginally beneficial for rendering, relative to the limits of the human visual system, therefore 3D chips will likely be deemed a waste of silicon (and more expensive bus plumbing), so the world will transition back to software-driven rendering. And, at this point, there will be a new renaissance in non-traditional architectures such as voxel rendering and REYES-style microfacets, enabled by the generality of CPU's driving the rendering process. If this is the case, then the 3D hardware revolution sparked by 3dfx in 1997 will prove to only be a 10-year hiatus from the natural evolution of CPU-driven rendering.

Sweeney was off by at least two years, but otherwise it appears more and more likely that he'll turn out to be correct about the eventual return of software rendering and the death of the GPU as a fixed-function coprocessor. Intel's forthcoming Larrabee product will be sold as a discrete GPU, but it is essentially a many-core processor, and there's little doubt that forthcoming Larrabee competitors from NVIDIA and ATI will be similarly programmable, even if their individual cores are simpler and more specialized.

At NVIDIA's recent NVISION conference, Sweeney sat down with me for a wide-ranging conversation about the rise and impending fall of the fixed-function GPU, a fall that he maintains will also sound the death knell for graphics APIs like Microsoft's DirectX and the venerable, SGI-authored OpenGL. Game engine writers will, Sweeney explains, be faced with a C compiler, a blank text editor, and a stifling array of possibilities for bending a new generation of general-purpose, data-parallel hardware toward the task of putting pixels on a screen.



[...]

The interview starts here (http://arstechnica.com/articles/paedia/gpu-sweeney-interview.ars)

DDS
09-15-2008, 09:58 AM
That was a hella good read. Thanks for sharing, man. I'm a big admirer of Sweeney, and it's good to hear that something is radically gonna change, because as a human being... I like changes, and if they are for the good, so much the better.

Kabab
09-15-2008, 10:33 AM
My prediction:

Middleware is going to become massively more critical in game production.

Cheap freebie engines will start to die off as they become far more difficult to write.

3D apps should be far more cross-platform compatible :)

P_T
09-15-2008, 10:35 AM
Here's another tech that might cause the end of the GPU.

http://www.tkarena.com/Articles/tabid/59/ctl/ArticleView/mid/382/articleId/38/Death-of-the-GPU-as-we-Know-It.aspx

Amazingly, a little-known Australian company, Unlimited Detail, claims to have found a way to render voxels in software, without the need for a hardware graphics accelerator.

There are some screenies on the following pages.

sonn
09-15-2008, 03:42 PM
Here's another tech that might cause the end of the GPU.

http://www.tkarena.com/Articles/tabid/59/ctl/ArticleView/mid/382/articleId/38/Death-of-the-GPU-as-we-Know-It.aspx



There are some screenies on the following pages.

I remember voxels... I used to love playing the original Delta Force.

P_T
09-15-2008, 04:04 PM
I remember voxels... I used to love playing the original Delta Force.

So did I. Now go and at least have a look at a couple of pages of the article.

arneltapia
09-15-2008, 04:16 PM
So did I. Now go and at least have a look at a couple of pages of the article.

What a nice article. Thanks for sharing. :thumbsup:

fuss
09-15-2008, 06:28 PM
Gosh, I hope this guy is right!

What got me into programming in the first place, back in the early 90s, was the demoscene (http://en.wikipedia.org/wiki/Demoscene). Programming the hardware directly and pushing it to and beyond accepted limits, exploiting it to achieve things not even the hardware manufacturers thought were possible... Pushing your own knowledge and creativity to achieve effects nobody had thought of before... That was the best time of my life as far as programming goes, the perfect mix of technology and art, science and creativity. That was Freedom with a big F, and it was taken away from us by the fixed rendering pipelines of the first 3D cards... With programmable shaders we got some of that freedom back, but we're still confined by too many things, mainly by the hardware itself. I also hope that if Sweeney is right, it will herald a renaissance of assembler programming. Then we're back to the early 90s: anything goes, baby! :D

heavyness
09-15-2008, 06:47 PM
My prediction:

Middleware is going to become massively more critical in game production.

Cheap freebie engines will start to die off as they become far more difficult to write.

3D apps should be far more cross-platform compatible :)

3D apps will become the middleware. I can't wait until I can save a Max or Maya file as a .unreal file and fire up UT3 to test out the map. No middleware, no extra apps...

A streamlined process = faster development = less money spent.

fuss
09-15-2008, 06:57 PM
Cheap freebie engines will start to die off as they become far more difficult to write.


The difficulty of doing something has never stopped determined people from doing it (in fact, some are spurred by difficulty to try even harder; I know I'm one of those people). If it can be done, it will be done; that much we've learned from the past :). And it's the age of the internet; there will always be enough people to band together (or maybe even individuals) to produce at least a couple of good free engines. Also, "difficulty" is a highly subjective thing, mainly based on your own knowledge and experience, so things difficult for some are bread and butter for others... If anything, I predict we'll see more engines based on different paradigms, even if the high-quality ones become fewer. But who cares, how many different engines do you need? Two or three quality ones to keep friendly competition going are enough; more than that and it just gets confusing for people who have to pick the right one for their project ;).

The difficulty of development, if anything, will probably only weed out the lesser engines, the ones you wouldn't consider using in the first place... Just my prediction.

Anyway, I'm looking forward to the next couple of years! :D

Joss
09-15-2008, 07:58 PM
And you know the job market for 3D/2D artists will bloom from all the hi-res assets that will need to be created for this. Can't wait to see some more demos. I'm sure Crytek will do a redo of the Crysis island. Woot!

mustique
09-15-2008, 08:22 PM
Jesus, that almost sounded like a Larrabee press release.

I do respect the creators of Unreal, but I don't agree with anything stated in that article.

The world doesn't consist only of 3D guys, gamers, video people and photographers. The CPU has to address many, many other tasks for many, many markets with different needs.

Going back to the CPU means going back to a multithreaded x86 architecture. Multicore CPUs have been around for years, yet very few game devs are attracted to writing multithreaded code for game engines. Why?

Because it's damn hard. You'd need much more time to put out a game, and your competition would leave you in the dust with proven tools and hardware. Which is GPU power + CUDA, OpenCL or AMD Stream-style APIs.

Vashner
09-15-2008, 09:00 PM
Larry Ellison tried a few times to kill the PC, calling it dead.

The GPU is not gonna go away anytime soon. It's good thinking, but the flaw is that GPUs are already more advanced than the coders and content creators.

What's broken is the creation of art with the GPU, not the GPUs themselves.

They also said DirectX would never make it.

davius
09-15-2008, 10:03 PM
Jesus, that almost sounded like a Larrabee press release.
heheeee! True!
As cool as it may sound, I'm really skeptical about all this fuss around Larrabee. Intel is posing as the visionary/prophet/leader of the next visual computing era, even though it has sucked big time in this department for the last 15 years or so. Either Intel has some MAJOR innovations behind its curtains that NVIDIA/ATI/AMD were too blind to see (which I doubt), or it will just launch Larrabee with great trumpets and lame performance.

Hey, not everything is lost! I believe Larrabee will do more to push GPU rendering forward for us 3D guys than it will to deliver a greater gaming experience. But who knows for sure? Gotta wait 'n' see what happens...

As a side note... O quantum processor, where are you?

erilaz
09-15-2008, 11:22 PM
I would have thought, if anything, that the GPU is going to be a far more significant factor in graphics processing than ever before, given the recent technologies that take the load off the CPU and handle it far more efficiently on the GPU.

Interesting article nonetheless! Gives you something to think about. :)

kiaran
09-16-2008, 12:42 AM
CUDA is seriously awesome. I'm just getting familiar with it and have finished the first few exercises that come with the SDK.

Being able to run general C code on a GPU with hundreds of threads is going to change things up considerably. Consumers probably won't notice anything soon, but in a few years this is really going to start making a splash.
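
To give an idea of what that looks like, here's roughly what one of the early SDK-style exercises boils down to. This is a toy sketch from memory, not NVIDIA's actual sample code, so treat the names and numbers as illustrative only:

// vector_add.cu - toy example of "general C code" spread across many GPU threads.
// Build (assuming the CUDA toolkit is installed): nvcc vector_add.cu -o vector_add
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes exactly one element of the output array.
__global__ void vectorAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float *hA = new float[n], *hB = new float[n], *hC = new float[n];
    for (int i = 0; i < n; ++i) { hA[i] = float(i); hB[i] = 2.0f * i; }

    // Device-side buffers, plus copies across the bus.
    float *dA, *dB, *dC;
    cudaMalloc((void**)&dA, bytes);
    cudaMalloc((void**)&dB, bytes);
    cudaMalloc((void**)&dC, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, a few thousand blocks: the "hundreds of threads" part.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(dA, dB, dC, n);

    cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
    printf("c[12345] = %f\n", hC[12345]);   // expect 3 * 12345 = 37035

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    delete[] hA; delete[] hB; delete[] hC;
    return 0;
}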

erilaz: your avatar is a teacup in silhouette. Mine is a teapot in silhouette. Weird ;P

Shletten
09-16-2008, 12:50 AM
I can't predict nor imagine the outcome of these things, but whatever happens, I just hope our video game experience will be tremendously improved and that it remains backward-compatible.

The whole buzz about rasterization vs. ray tracing / GPU vs. CPU is kind of confusing for me, and I can't choose my camp or decide what I should expect out of it. But I am excited by the sparse voxel octree stuff; if it allows a greater amount of polygons, I am sold! :shrug:

Still, I am just a young teenager who doesn't have a clue what he's talking about, but at least I am more informed than many.

BitsAndBytes
09-16-2008, 01:10 AM
Certainly interesting. Of the current 3D applications, ZBrush is pretty much the only one (afaik) using software rendering as opposed to GPU hardware acceleration, and it's definitely no slouch, which is something that speaks in favour of this speculation. Currently your main CPU(s) certainly has more horsepower than your GPU, but the GPU is tailored for one task only, which is to render graphics as fast as possible, and this gives it great bang for the buck despite its comparatively meager clock speed.

However, programming the GPU is of course less flexible than writing your own software renderer, which, unlike the GPU, doesn't have to offer a general solution but can instead be tailored exactly to what you need. Now, to efficiently exploit this possibility you have to have the know-how to write this perfectly tailored renderer, and that is obviously A LOT harder than building your engine around OpenGL or DirectX hardware-accelerated functionality wrapped up in a nice API. I doubt that the industry will forgo the relative ease of development through OpenGL/DirectX in favour of flexible, tailored-for-the-task-at-hand, faster software rendering. But time will tell.

ambient-whisper
09-16-2008, 01:29 AM
CUDA is seriously awesome. I'm just getting familiar with it and have finished the first few exercises that come with the SDK.

Being able to run general C code on a GPU with hundreds of threads is going to change things up considerably. Consumers probably won't notice anything soon, but in a few years this is really going to start making a splash.

erilaz: your avatar is a teacup in silhouette. Mine is a teapot in silhouette. Weird ;P

If this is anything to go by, then hell yes!

http://www.xnormal.net/2.aspx

Less waiting time for everyone.

richcz3
09-16-2008, 05:44 AM
I remember voxels... I used to love playing the original Delta Force.

If I remember right, Comanche: Maximum Overkill (http://www.youtube.com/watch?v=51E_G7NCXVM) (a DOS helicopter flight sim from 1992) used voxel terrain.

ambient-whisper
09-16-2008, 06:13 AM
Man, 1992... I'm surprised how good that looks for 1992. You sure it's 1992?

Voxels, reflecting water, multiple cameras, crappy music :D, and landscapes that would still rival games from a few years back... (then again, most games still use very primitive tech in terms of large-scale terrain; luckily id will change that with Rage).

DuttyFoot
09-16-2008, 04:49 PM
Hey Rich, thanks for that. I used to love Comanche; I remember that exact mission where you had to blow up those oil tankers. One of my faves was the night mission where you had to take out some tanks.

CHRiTTeR
09-16-2008, 06:03 PM
Man, 1992... I'm surprised how good that looks for 1992. You sure it's 1992?

Voxels, reflecting water, multiple cameras, crappy music :D, and landscapes that would still rival games from a few years back... (then again, most games still use very primitive tech in terms of large-scale terrain; luckily id will change that with Rage).

Comanche 2 was released in 1995, so it could be that Comanche 1 was around 1992.

Also check out the game "Outcast" (by Appeal), which looked pretty amazing for its time (1999), all using voxels (no hardware accel.). It even had antialiasing, bump mapping, reflections, depth of field...

http://en.wikipedia.org/wiki/Outcast_(game)

richcz3
09-16-2008, 06:07 PM
Yeah, Nova Logic really had the voxel technology going on in their games, and it did look good.
You needed a really fast 386 to get decent frame rates @ 640x480 :surprised. Interestingly enough, the 3D accelerator market was still a ways off. Even with that, polygon acceleration won out.

I can't do a proper search, but if I remember reading the Nova Logic manual correctly (way back when), the voxel technology had its roots in medical imaging.

As for the end of the GPU: there are industries with strong players heavily invested in the GPU, and I don't expect them to roll over and die anytime soon. It goes without saying, though, that voxels (CPU) and polys (GPU) have coexisted and can coexist. No reason for one to undermine the other.

CHRiTTeR
09-16-2008, 06:14 PM
I do think the GPU will be combined with the CPU (isn't that the case with the Larrabee thing?)

fuss
09-16-2008, 06:52 PM
Wasn't the big limitation of voxel rendering (at least in those aforementioned games) the fact that you could rotate only around 2 axes? Or does my memory fail me, or has the problem been solved and become a non-issue by now? Other than that, I don't see how voxel rendering offers any significant advantages over polygon rendering with per-pixel displacement mapping... unless we're venturing into the volumetric rendering area. However, I have been almost completely out of the loop regarding graphics programming for the past couple of years, so if I'm wrong, somebody with more experience and current knowledge in this field please correct me...

Also, about the GPU/CPU debate... I think there are pros and cons on both sides, and nobody knows how it will all turn out. Maybe they will even co-exist, at least for a while... As someone pointed out, a lot of it will have to do with where the money of the big investors goes, at least in the beginning, but on one thing Sweeney is definitely right: having one architecture to deal with is better, and off the top of my head there is at least one more benefit of doing the calculations on the main processor: you have direct access to your system's RAM, without wasting time transferring and duplicating data between the GPU's memory and your main system's memory.
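
If you ever want to see that transfer overhead for yourself, here's the kind of quick probe I'd throw together with NVIDIA's CUDA runtime. It's an untested sketch, the 64 MB payload is an arbitrary choice, and the numbers only mean something once you run it on your own card:

// bandwidth_probe.cu - rough sketch: time a host<->device round trip to see
// where the time goes when data keeps bouncing between system RAM and the GPU.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    const size_t bytes = 64 << 20;              // 64 MB payload (arbitrary)
    float *host = 0, *dev = 0;
    cudaMallocHost((void**)&host, bytes);       // pinned host memory for a fair test
    cudaMalloc((void**)&dev, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time host -> device -> host, i.e. the traffic a GPU ray tracer keeps paying for.
    cudaEventRecord(start);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("round trip of %zu MB took %.2f ms (%.2f GB/s effective)\n",
           bytes >> 20, ms, (2.0 * bytes / (ms / 1000.0)) / 1e9);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(dev);
    cudaFreeHost(host);
    return 0;
}

A software renderer running on the CPU never pays that toll; its scene data already lives in the same memory the code is executing from.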

And as far as the difficulty of programming multi-core CPUs goes... Sweeney is aware of that and mentioned it at the end of the interview. That's one problem, however, that I believe will be solved once the technology has been around longer and the right techniques and compilers have had time to be developed. It always takes longer for the software to catch up with the hardware, but eventually it will happen. And if not? Well, personally I like challenges, and I gladly trade simplicity for more freedom... ^^

CHRiTTeR
09-16-2008, 07:06 PM
Wasn't the big limitation of voxel rendering (at least in those aforementioned games) the fact that you could rotate only around 2 axes?

First time I hear that?!
In both the examples mentioned earlier (Comanche and Outcast) you were able to rotate on all 3 axes...

I am not a programmer, but I can't see why it would be a problem to rotate on 3 axes when using voxels?

you have direct access to your system's RAM, without wasting time transferring and duplicating data between the GPU's memory and your main system's memory.

That's one of the main problems with doing ray tracing on the GPU. You lose too much time because lots of data has to be swapped constantly between the two; that's mainly why it's a lot faster to do it on the CPU.

FlorinMocanu
09-17-2008, 08:30 AM
Certainly interesting. Of the current 3D applications, ZBrush is pretty much the only one (afaik) using software rendering as opposed to GPU hardware acceleration, and it's definitely no slouch, which is something that speaks in favour of this speculation. Currently your main CPU(s) certainly has more horsepower than your GPU, but the GPU is tailored for one task only, which is to render graphics as fast as possible, and this gives it great bang for the buck despite its comparatively meager clock speed.

However, programming the GPU is of course less flexible than writing your own software renderer, which, unlike the GPU, doesn't have to offer a general solution but can instead be tailored exactly to what you need. Now, to efficiently exploit this possibility you have to have the know-how to write this perfectly tailored renderer, and that is obviously A LOT harder than building your engine around OpenGL or DirectX hardware-accelerated functionality wrapped up in a nice API. I doubt that the industry will forgo the relative ease of development through OpenGL/DirectX in favour of flexible, tailored-for-the-task-at-hand, faster software rendering. But time will tell.

In terms of raw processing power, GPUs are far ahead of CPUs. A current HD 4870 stands at around 1200 GFLOPS of processing power, while a quad-core Xeon 5482 stands at a meager 50-60 GFLOPS, not to speak of the dual-GPU 4870 X2, which stands at an impressive 2400 GFLOPS. And things will get even higher with the next-gen GPUs.

Now, that is why CUDA appeared: so that programmers can take advantage of the huge raw power of GPUs using straight C. That's why you can now see distributed computing done on GPUs much faster than on CPUs, and video encoders starting to appear that take advantage of GPUs (see Badaboom).

fuss
09-20-2008, 04:15 AM
First time I hear that?!
In both the examples mentioned earlier (Comanche and Outcast) you were able to rotate on all 3 axes...

I am not a programmer, but I can't see why it would be a problem to rotate on 3 axes when using voxels?

You're right, my bad. Got some old memories mixed up ;). Algorithmically, it's never been a problem. It's just that in the very early days of 3D (and voxel rendering, for that matter) on PCs, it was too expensive to "roll" the camera during real-time renders (well, the "rolling" itself wasn't the problem, the speed of the rendering was ;)). Without going too much into detail, it had to do with the way the frame buffer was/is structured. But those are days long past. You're right, I mixed some stuff up; it's been a long time. Never played Outcast, but it worked in Comanche indeed... ;)
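
For anyone curious what I mean by "the way the frame buffer was structured": the Comanche-style renderers (the so-called voxel space technique) walked a ray across a 2D heightmap for every screen column and drew the terrain as vertical pixel runs, column by column. Here's a from-memory sketch of that loop; it's the textbook reconstruction of the idea, not Nova Logic's actual code, and all the names and constants are mine:

// heightfield.c - from-memory sketch of a Comanche-style "voxel space" terrain loop.
#include <math.h>

#define SCREEN_W 320
#define SCREEN_H 200
#define MAP_SIZE 1024   /* map dimensions, power of two so we can wrap with a mask */

static unsigned char heightmap[MAP_SIZE][MAP_SIZE];   /* terrain altitude      */
static unsigned char colormap[MAP_SIZE][MAP_SIZE];    /* terrain colour        */
static unsigned char framebuffer[SCREEN_H][SCREEN_W]; /* 8-bit paletted screen */

static void draw_vertical_run(int x, int y_top, int y_bottom, unsigned char color)
{
    for (int y = y_top; y < y_bottom; ++y)
        framebuffer[y][x] = color;
}

/* Render one frame from camera (cam_x, cam_y) at height cam_h, heading 'angle'. */
void render_terrain(float cam_x, float cam_y, float cam_h,
                    float angle, float horizon, float max_dist)
{
    float sin_a = sinf(angle), cos_a = cosf(angle);

    for (int x = 0; x < SCREEN_W; ++x) {
        /* Ray direction for this screen column (simple planar projection). */
        float offs = (x - SCREEN_W / 2) / (float)SCREEN_W;
        float dx = cos_a - offs * sin_a;
        float dy = sin_a + offs * cos_a;

        int y_buffer = SCREEN_H;   /* lowest pixel in this column not yet covered */

        /* March the ray front to back, drawing only what isn't hidden yet. */
        for (float dist = 1.0f; dist < max_dist; dist += 1.0f) {
            int mx = ((int)(cam_x + dx * dist)) & (MAP_SIZE - 1);
            int my = ((int)(cam_y + dy * dist)) & (MAP_SIZE - 1);

            /* Project the terrain height at this map sample into screen space. */
            float rel_h = cam_h - heightmap[my][mx];
            int screen_y = (int)(rel_h / dist * 240.0f + horizon);
            if (screen_y < 0) screen_y = 0;

            if (screen_y < y_buffer) {
                draw_vertical_run(x, screen_y, y_buffer, colormap[my][mx]);
                y_buffer = screen_y;
            }
        }
    }
}

Yaw is just a different 'angle' and pitch is a shifted 'horizon', but a true roll would tilt those vertical runs, which is exactly what this column-per-column scheme (and the chunky frame buffer layout behind it) doesn't give you for free. That's the limitation I was half-remembering.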

andrewjohn81
09-22-2008, 08:44 PM
This really just seems like a moot article. How many people still use 10-year-old computers? LOTS! That means if something as life-changing as this were to happen, it would take a hella long time before really being adopted globally. Computers change fast compared to other things, but this just seems like a claim similar to "in 10 years we'll be flying our cars and they will use only solar power." As much as that could theoretically be possible, there are more than just physical problems to solve.
One important person making a decision can easily affect the timeline of something emerging by a year. And there are a LOT of "important" people involved in something like this.

FlorinMocanu
09-23-2008, 07:59 AM
I doubt "lots" of people use 10 year old computers. Maybe 4-5 year old ones, like my father, but not 10.

andrewjohn81
09-23-2008, 12:12 PM
No, seriously, 10. I bought a computer in high school, in '98, and it still works just fine. Remember those old things, the ones that came with modems? I just play with it to run Ubuntu, but I know lots of people who have computers that are that old. Not gamers, and definitely not very many people around here, but I do know lots of people who have computers most of us would consider antiques.
As time goes on, a 10-year-old computer is starting to become less distant for many people. If all they need to do is browse the web, they don't need anything special. Most people really don't demand superb graphics or multiple processors; it just wouldn't make much difference for really simple computing. I think that shows a bit, since simpler computers have been gaining acceptance: the Eee PC, super-small laptops with nearly no power, "green" PCs... these aren't much better than a 10-year-old computer but are working fine for many people. Maybe not for me, since I have a bad habit of having every piece of software I'm going to use for the day open for days at a time, all at once, but for many people these machines will be fine.

CHRiTTeR
09-23-2008, 03:47 PM
No, seriously, 10. I bought a computer in high school, in '98, and it still works just fine. Remember those old things, the ones that came with modems? I just play with it to run Ubuntu, but I know lots of people who have computers that are that old. Not gamers, and definitely not very many people around here, but I do know lots of people who have computers most of us would consider antiques.
As time goes on, a 10-year-old computer is starting to become less distant for many people. If all they need to do is browse the web, they don't need anything special. Most people really don't demand superb graphics or multiple processors; it just wouldn't make much difference for really simple computing. I think that shows a bit, since simpler computers have been gaining acceptance: the Eee PC, super-small laptops with nearly no power, "green" PCs... these aren't much better than a 10-year-old computer but are working fine for many people. Maybe not for me, since I have a bad habit of having every piece of software I'm going to use for the day open for days at a time, all at once, but for many people these machines will be fine.

10 years is a VERY long time in computerland (and about 1/7 of our life, so quite long for us too :p ).
I know only one person who's using a 10-year-old system (no upgrades), but that's really an exception.
It's for his girlfriend to type stuff for school and play old-school Warlords (lol).
But he's going to get a new laptop for himself, so she gets the old one and the 10-year-old system goes bye-bye.

Maybe some old people who don't know anything about computers have 10-year-old systems which they barely use.
But then again, because they don't know anything about it, they need to upgrade relatively sooner, because their system gets slow very quickly (they are the type that installs all those little garbage tools that f*ck up your system).

Keep in mind that 10 years ago, a 1 GHz Pentium 3 with 256 MB RAM, a 60 GB hard drive and a GeForce 2 was a pretty awesome machine.
You say that's enough to browse the net? One would think so, but please do try it... ;)
Technology on the internet has also grown, maybe a bit silently and stealthily, but it certainly has.
For example: try to visit an up-to-date Flash-heavy site on a 1 GHz P3 (which was pretty much top of the line at that time). ;)

But if you're gonna type letters all day and check your mail, it's probably good enough (but don't try to run the new Office on it! lol).

I doubt "lots" of people use 10-year-old computers. Maybe 4-5-year-old ones, like my father's, but not 10.

Indeed.

andrewjohn81
09-23-2008, 04:04 PM
So what's the difference between those and, say, some of the computers that are coming out now with the Atom processor and a rather small hard drive? And, well... they do have like a gig of memory. But my point still stands for the most part, even if the evolution is more like 5 or 6 years. It's not an instantaneous change. Development still takes a really long time.

CHRiTTeR
09-23-2008, 04:28 PM
Those are for ultra-mobile PCs and smartphones ;)
They are new CPUs, built on a 45 nm CMOS process, and they incorporate some newer tech the older CPUs didn't have; they're also a lot more power-efficient. Hell, they even have Hyper-Threading, lol.

So it's not the same thing, and comparing those is far from the correct thing to do. ;)

ambient-whisper
09-23-2008, 08:56 PM
Just thought I'd mention that GeForce 2 cards were available around 2000, not exactly 10 years ago. I remember because I had gotten a GeForce 2 card and a Pentium 3 800 around then, and had finished school that year too.

My first computer was from exactly 10-10.5 years ago. It was a Pentium 2 266 with 96 MB RAM and an AccelSTAR II video card which sported 8 MB RAM; it was equivalent to an Oxygen 3D card at the time. That machine was pretty much top of the line back then. (I think the Pentium 2 300 was the very top end, and 96 MB RAM was a huge deal back then, as most people only had 16 MB at the time.)

I really doubt THAT many people are still using 10-year-old machines. For a thousand bucks you can get yourself a cheapo HP. I had a friend who did this recently, and inside it is 4 GB RAM, a quad core, a GeForce 8500, and even a 22" HP monitor.

Not bad for a grand at Future Shop/Best Buy :D

CHRiTTeR
09-23-2008, 09:05 PM
Yeah, I wasn't 100% sure about the age of the system, but I knew it certainly wasn't older...
Must be something like 8 years old then, as my brother bought it a year or two after I got out of school (which was also 2000).

CGTalk Moderation
09-23-2008, 09:05 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.