PDA

View Full Version : Now we're talking! Direct X 11 To Use GPU For Parallel Processing


Szos
07-23-2008, 05:02 AM
http://gizmodo.com/5028013/microsoft-directx-11-to-use-gpu-for-parallel-processing

Direct X 11 To Use GPU For Parallel Processing (http://gizmodo.com/5028013/microsoft-direct-x-11-to-use-gpu-for-parallel-processing)



Direct X (http://gizmodo.com/tag/direct-x/) 11 is coming, and it looks pretty awesome. Sure, you get advancements in shading and better support for multi-core machines, but what's really got our heads turning is the concept of letting programmers use the GPU in your video card to do some of the heavy lifting, meaning your graphics chip becomes a second, parallel processor. While the idea itself isn't new, this is the first we've heard of Direct X using such technology, and we're sure it'll have PC gaming fanboys drooling when it rolls out, whenever that happens to be.

Using the GPU for processing data is not exactly new, but having the weight of MS behind one standard way of accessing that power is what will FINALLY push this technology from a niche into the mainstream. Right now there is way too much BS concerning what brand of video card you have (i.e. stuff like Gelato obviously wouldn't work with ATi cards) and then what model card you are running ("pro" versus gaming). This technology has needed one big name within the industry to guide developers, and it looks like DX11 might be it. Very cool news.

BeBraw
07-23-2008, 06:19 AM
The Khronos Group is going in the same direction. See http://www.khronos.org/news/press/releases/khronos_launches_heterogeneous_computing_initiative/ .

Ivan D Young
07-23-2008, 04:49 PM
The spec has not been completed for DirectX 11; Microsoft is rumored to want to add things like displacement and maybe subdivision surfaces. Again, this is rumor, but it is interesting. DirectX 11 could be a huge jump in technology that the industry would have to want to take advantage of. Otherwise we could see tools shipping with game engines doing as much or more than conventional tools. Then we would be competing at the lower end with twelve-year-olds, oh boy!

ChrisDNT
07-23-2008, 07:26 PM
Does this mean that the graphic cards, under DX11, could be used for 3d rendering calculations too?

rebb
07-23-2008, 08:19 PM
Sounds like Cuda :p

Bullit
07-23-2008, 08:20 PM
Yes it means that. There are several instances where Game engines are used for Render mainly for interactive Architecture work.

BColbourn
07-23-2008, 08:21 PM
The new shader technology seems confirmed.

"New compute shader technology will be available for developers to one day use a system's GPU as a parallel processor, and tessellation, which, according to the press release Big Download received via email, "blurs the line between super high quality pre-rendered scenes and scenes rendered in real-time," will also be available."

see: http://news.bigdownload.com/2008/07/22/microsoft-announces-directx-11-details/
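For anyone unfamiliar with the compute-shader model the press release refers to: a dispatch runs one kernel function across a grid of thread groups, with each thread identified by its coordinates and working on its own element of a buffer. A rough CPU sketch of that execution model (all names here are illustrative, not the actual D3D11 API, which had not even shipped at the time of this thread):

```python
# Toy model of a compute-shader dispatch: the "kernel" runs once per
# thread, identified by (group_id, thread_id), over a flat data buffer.
# This only illustrates the programming model; a real compute shader
# runs thousands of these threads in parallel on the GPU.

def dispatch(kernel, num_groups, threads_per_group, buffer):
    for group_id in range(num_groups):
        for thread_id in range(threads_per_group):
            global_id = group_id * threads_per_group + thread_id
            if global_id < len(buffer):  # guard partial final group
                kernel(global_id, buffer)

def square_kernel(gid, buf):
    # Each "thread" transforms one element independently -- the
    # data-parallel pattern GPUs are built for.
    buf[gid] = buf[gid] * buf[gid]

data = [1, 2, 3, 4, 5, 6]
dispatch(square_kernel, num_groups=2, threads_per_group=4, buffer=data)
print(data)  # [1, 4, 9, 16, 25, 36]
```

The point of the model is that threads have no ordering dependencies on each other, which is what lets the hardware schedule them freely.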

Szos
07-24-2008, 12:16 AM
The Khronos Group is going in the same direction. See http://www.khronos.org/news/press/releases/khronos_launches_heterogeneous_computing_initiative/ .

For GPU-rendering to FINALLY become a reality, it has to be supported by the single most powerful company in the computer world. The whole reason this tech hasn't advanced beyond the most basic stages is simply that everyone and their mother seems to have their own way of doing it, which is of course incompatible with everyone else's way of doing it - which makes all these techniques pointless, because they would have to be supported by multiple software and hardware companies at the same time. What are the chances of competing software and hardware companies working together to get this technology going? Yeah, exactly... zero to none. MS is one of the few companies that could pull this off.

inguatu
07-24-2008, 01:33 AM
MS is one of the few companies that could pull this off.

Yeah, if only MS haters and Apple and Linux fangirls wouldn't whine about it simply because Microsoft is doing something positive. Wait... that won't happen, they'll still whine. Charge on, MS!

Sonk
07-24-2008, 02:34 AM
Microsoft is rumored to want to add things like displacement and maybe subdivision surfaces. Again, this is rumor, but it is interesting. DirectX 11 could be a huge jump in technology that the industry would have to want to take advantage of. Otherwise we could see tools shipping with game engines doing as much or more than conventional tools. Then we would be competing at the lower end with twelve-year-olds, oh boy!

Not quite a rumor it seems:

http://www.linkedin.com/pub/5/3B8/046

"Work with Microsoft on DX11 for supporting subdivision surface for higher degree of realism through graphics processor."

zukezuko
07-24-2008, 02:06 PM
Tim Sweeney of Epic Games said in an interview:

Q. What are your feelings on displacement mapping?

A. Much ado is made about this feature, but ultimately displacement mapping is a form of geometry compression, and to realistically assess its benefits and drawbacks we must compare it to other geometry-compression schemes. In that regard, it's a pretty crappy form of geometry compression! It requires a parameterization of the underlying surface (which itself imposes significant burdens on artists to create an artifact-minimizing mapping and to hide the seams), and has a directional bias often unrelated to the underlying geometry.

Indeed, there will someday be a revolution in fine tessellation of objects with sub-pixel triangle rendering, but displacement mapping won't be the magical feature that empowers it. More realistically, GPU makers talk about displacement mapping because it's a thing they know how to easily fit into their existing pipeline. Much of the modern graphics pipeline is derived from such expedience rather than a thorough analysis of how we might maximize rendering detail with the minimum hardware cost.

http://www.evga.com/gaming/gaming_news/gn_100.asp
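To make the term in Sweeney's answer concrete: displacement mapping stores a scalar height per surface point and pushes each vertex along its normal by that amount, so dense geometry can be reconstructed from a coarse mesh plus a height texture - which is why he calls it a geometry-compression scheme. A minimal sketch of the per-vertex math (illustrative only, not any particular API):

```python
def displace(position, normal, height, scale=1.0):
    """Move a vertex along its (unit) normal by a sampled height value.

    This is the core of displacement mapping: the height map acts as
    compressed geometry, re-expanded at render time.
    """
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# A vertex on the XY plane with an upward-facing normal, displaced by 0.25:
print(displace((1.0, 2.0, 0.0), (0.0, 0.0, 1.0), 0.25))  # (1.0, 2.0, 0.25)
```

Sweeney's objection is visible even here: the scheme only moves points along one direction per sample, and it presupposes a UV parameterization to look the height up with.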

also he predicts the end of directx here:


I think that DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that will use both the CPU and the GPU - using graphics processor programming language rather than DirectX. I think we're going to get there pretty quickly.

and

There are significant advantages in doing it yourself, avoiding all the graphics API calling and overhead. With a direct approach, we can use techniques that require wider frame buffer, things that DirectX just doesn't support. At Epic, we're using the GPU for general computation with pixel shaders. There is a lot we can do there, just by bypassing the graphics pipeline completely.

both above found here :
http://www.tgdaily.com/content/view/36410/118/
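The "GPU for general computation with pixel shaders" trick Sweeney mentions was the standard pre-compute-shader GPGPU pattern: draw a full-screen quad into an off-screen texture, so the pixel shader runs once per output texel, reading input data packed into other textures. A toy CPU model of that pattern (hypothetical names, no real graphics API involved):

```python
# Toy model of pixel-shader GPGPU: data lives in 2D "textures" and a
# shader function is evaluated independently for every output texel.
# Here the "shader" adds two input textures elementwise -- a classic
# early-GPGPU workload.

def render_fullscreen_pass(shader, width, height, *input_textures):
    # One shader invocation per output pixel, exactly like rasterizing
    # a full-screen quad into a render target.
    return [[shader(x, y, *input_textures) for x in range(width)]
            for y in range(height)]

def add_shader(x, y, tex_a, tex_b):
    # Reads one texel from each input; no cross-texel dependencies,
    # which is what made such computations map well onto pixel shaders.
    return tex_a[y][x] + tex_b[y][x]

a = [[1, 2], [3, 4]]
b = [[10, 20], [30, 40]]
out = render_fullscreen_pass(add_shader, 2, 2, a, b)
print(out)  # [[11, 22], [33, 44]]
```

Compute shaders remove the need for this detour through the rasterizer, which is largely what the DX11 announcement in this thread is about.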

If you ask me, they'd better improve 10 before going to 11... 10.1?

BColbourn
07-24-2008, 05:23 PM
So since Mr. Sweeney doesn't seem to have much faith in the longevity of DirectX, does that mean Epic is working on their own DirectX competitor? UE4? Please?

P_T
07-24-2008, 05:58 PM
How many PC games are actually optimised for multi-core and 64-bit? Not to mention that even if this move is considered a positive one, it'll still have little impact if it's made exclusive to MS's future OS, like the current DX10 is to Vista.

BColbourn
07-24-2008, 06:03 PM
P_T: it could potentially be used on Microsoft's next console as well.

Ivan D Young
07-24-2008, 06:35 PM
Yeah Sonk, I was trying to be responsible with my post, and then Microsoft officially announced some of their specs. Oops. Seriously, though: how fast are the 3D app companies and associated companies going to be able to jump on this? If a game engine were shipping with all-inclusive tools, it would be able to surpass certain 3D apps out of the gate.
The real-time feedback and real-time rendering would certainly be an enticing feature to make any artist want to take advantage of that.
The software people in this field are really going to have to focus on delivering new software that takes advantage of all this power that sits in our computers and really only works when we render.
One thought that has always bugged me: how in the hell can Adobe not be right on top of this technology? They are the largest software company in the world that does not make an OS; you do not get that big and stay that big by being slow and lethargic, or do you?

Last thought: what do people think - when all these multi-cores and multi-GPUs and DirectX 11 are available, could we see some sort of gaming app that could eat into this industry, with much younger talent doing the driving? With real-time this and real-time that, all they would have to do is record.

UrbanFuturistic
07-24-2008, 06:37 PM
P_T: it could potentially be used on microsoft's next console as well.

That'll be a few versions down the line.

As it is, DirectX isn't going anywhere despite what anyone from Epic might say; Windows Direct3D support from drivers (for starters) is just better than the OpenGL support in a lot of cases, enough cases that it has to be supported or a game will just fall over on a lot of computers. Not that you'd notice with decent drivers but there's enough go-se out there that it can't be ignored so Mr Hardcore Gamer with the latest nVidia graphics will run any OpenGL based game just fine but anyone with an integrated graphics chipset could be in for a rough time.

Honestly, if I was doing a 3D based game, I'd probably choose Direct3D over OpenGL. Fortunately there are 3D engines that let you support both from a common codebase and only get specific when you need to.

That aside, didn't someone already get subsurf rendering using advanced shaders like HLSL/GLSL? I could swear that was posted somewhere on here.

P_T
07-25-2008, 03:56 AM
That aside, didn't someone already get subsurf rendering using advanced shaders like HLSL/GLSL? I could swear that was posted somewhere on here.

I remember seeing that in the old ATi RADEON 9800pro tech demo.

FreakWizz
07-25-2008, 04:10 AM
The problem we mainly have in the CG professional arena is that OGL is the de facto standard for professional applications. And while it may still have some advantages, and I personally prefer it for programming in, unfortunately over the last few versions I feel D3D/DX is the superior technology even for 3D applications. Consumer-level video cards are optimized for D3D, and as such D3D performs better on a Windows system than a much more expensive, less-featured OGL workstation card.

In a time when Linux and MacOS are seeing increased sales, that makes it hard for developers who would prefer to support OGL for its multiplatform and open standards.

I think, however, the tide is turning, and if I'm stuck with Windows anyway, I would like the fastest and best technology used not just in games but in the professional programs that are used in games and film anyway. So I say bring on DX11, and bring support from the professional 3D application market while we're at it... Make it more than a toy this time. Maya, C4D, XSI, LW, Modo, Mudbox... with D3D viewports, please!

OGL is dead, long live OGL! ;)

BColbourn
07-25-2008, 04:24 AM
here's a tessellation tech demo: http://www.gametrailers.com/player/20445.html
this one isn't directx 11, but a very cool real time indirect lighting tech demo: http://www.gametrailers.com/player/usermovies/233999.html

KayosIII
07-25-2008, 11:49 AM
The problem for us Linux and Mac types is not that Microsoft never does anything right. It's that when the Microsoft offering becomes the adopted standard, it leaves us out in the cold, and quite frankly not having a choice really, really sucks. While the one choice you do have may be reasonable at the moment, lack of choice and competition tends to lead to market stagnation. Though given this is new tech, that will probably be a while off.

On the whole, though, it is good to see GPGPU gaining traction, as GPUs seem to be remarkably quicker at some tasks. I was a bit puzzled as to why Microsoft would make this part of DirectX, since under most usage scenarios somebody writing DirectX applications is probably going to be using the GPU more than the CPU already. But then DirectX is about direct access to hardware, so I guess that makes sense.

CGTalk Moderation
07-25-2008, 11:50 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.