|12 December 2008||#1|
Winter Park, USA
i7 or 790i for multi-card CUDA?
I've been researching CUDA and GPU rendering lately, and I'm about to be in the market to upgrade my workstation. I'm wondering which motherboard options will be most compatible with this technology over the next 2 to 3 years (2009-2012). I do a lot of rendering and would really prefer to get as many CPUs and cores as possible, but I don't want to trade CPU power for GPU connectivity if the industry is about to change to GPU-based rendering.
I've seen dual- and even quad-socket motherboards on the market, but they generally have Intel north bridges and aren't designed for advanced graphics, so they seem ill suited for my purposes; if I'm mistaken in this, please tell me. Next up is the Core i7 (LGA 1366), which only comes with an Intel north bridge but has triple-channel RAM and all-around higher numbers, versus LGA 775 with the nForce 790i Ultra SLI north bridge. It would seem that NVIDIA's own north bridge should work best with GPU rendering, but perhaps I'm wrong?
To summarize: do CUDA and GPU-based rendering (Gelato, etc.) require certain north bridges to function at maximum capacity (assuming multiple cards), and does the potential of GPU rendering trump multi-CPU in the next 2 to 3 years? Is there ANY benefit to using motherboards with the 790i north bridge over the i7 with its Intel north bridge?
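As a side note on the chipset question: CUDA talks to GPUs through the NVIDIA driver and the PCIe bus, so device discovery and kernel launches work the same on any chipset that provides the slots; the chipset mainly affects PCIe lane counts and bandwidth, not whether CUDA functions. A minimal sketch (using the standard CUDA runtime API; build with `nvcc devices.cu -o devices`, requires an NVIDIA GPU and driver) that enumerates every CUDA-capable card regardless of north bridge:

```cuda
// Enumerate all CUDA-capable devices visible to the driver.
// Works identically whether the PCIe slots hang off an Intel X58
// or an nForce 790i north bridge.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // No driver / no device: report and bail out.
        std::printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %d multiprocessors, compute %d.%d\n",
                    i, prop.name, prop.multiProcessorCount,
                    prop.major, prop.minor);
    }
    return 0;
}
```

A multi-card renderer would then pick a device per worker thread with `cudaSetDevice(i)`; which motherboard hosts the cards is invisible at this level.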
|12 December 2008||#2|
manchester, United Kingdom
OK, crystal ball time now; it's virtually impossible to predict who or what is going to happen in the PC market in the next 2-3 years.
As NVIDIA now owns mental ray, I'd imagine that somewhere down the line CUDA will be integrated with it. Gelato is now dead in the water.
It looks like the next version of DirectX will also have GPGPU functionality, so this should widen the market.
ATI has its own version of course, though there are rumours that CUDA could be ported to it.
Apple is, I think, looking at using OpenCL.
Then there's Intel with its supposedly NVIDIA-killing Larrabee.
So as you can see, your choice is varied as to which one wins out, if any. Who knows...
|12 December 2008||#3|
This should perhaps be a separate thread/discussion from the one indicated in the OP's title, but I'll give my perspective on the GPU rendering situation.
To date, no company has demonstrated the capability to develop a cinematic-quality, production-ready rendering engine that runs entirely (or primarily) in "hardware" such as a GPU.
The main challenge seems to be that commercially viable, "general purpose" 3D rendering engines, in order to be widely accepted, need to exhibit the following qualities:
1. flexible & extensible shader options and scene processing parameters for the widest range of creative and aesthetic possibilities
2. acceptable overall rendering performance (i.e. speed) across a wide range of hardware options and scene construction methodologies
3. platform independence
Today, there is no GPU-centric renderer on the market that I know of which provides all three of these. In fact, there's nothing that really even comes close.
Developing products that can bring the power of GPUs to "cinematic rendering"* is apparently going to be harder than many had hoped.
There are development teams working on solving the challenges of making better use of GPUs for general-purpose rendering, but if it were easy, there would already be products out there...
* For argument's sake, let's define "cinematic rendering" as rendered imagery of equivalent quality to that produced by modern software renderers such as mental ray, RenderMan (and RIB-compatible renderers), C4D, and other high-quality raytracers, or "physically correct" renderers such as Maxwell.
Last edited by BOXXlabs : 12 December 2008 at 06:25 PM.