CGTalk > Development and Hardware > Technical and Hardware
Old 02-11-2013, 10:15 PM   #1
FuriousPanda
New Member
portfolio
Damon Westenhofer
Illustrator and 3D Animator
Vest Advertising
Louisville, USA
 
Join Date: May 2010
Posts: 6
Send a message via AIM to FuriousPanda
Your opinions/experiences with graphics cards GTX/Quadro/AMD

I know the graphics card question seems to get asked over and over. So I'm sorry about that, but I have a specific question.

I have been doing quite a bit of research and searching through forums to get opinions from other CG professionals. I want to build a new workstation for Zbrush, Maya, Vray, Max, CS, and I'd like to be able to do some modding and gaming as well.

I hear (here and elsewhere) that the current Nvidia 6xx series is supposed to be poor in the Maya viewport, so that's out. I'm looking at the Quadro 4000, but I've read speculation that Nvidia is capping its performance to push people toward adding Tesla cards. I still feel the Quadro is probably my first choice, though. I've thought about AMD's cards as well, but I don't know how they behave in some of these programs/games. I have a Radeon in my work Mac and it seems to handle my 3D apps pretty well, though I hear they're spotty in games.

A few people in the business keep suggesting the Nvidia 570 to me. It seems like a step backward, but I guess that generation of cards really wasn't produced that long ago... Would I be better served buying the Quadro 4000, or buying a 570 or 580 and putting the extra money into CPU and RAM? I know I'd take a hit in card RAM unless I got something like the 580 Classified. Also, if anyone is using AMD cards for both 3D and games, I'd love to hear your experiences. I'm just looking for opinions and experiences from you all, and thank you very much in advance.
__________________
Damon Westenhofer
Tumblr
Twitter
LinkedIn
CGhub
DeviantArt
 
Old 02-12-2013, 02:00 AM   #2
ThE_JacO
MOBerator-X
 
ThE_JacO's Avatar
CGSociety Member
portfolio
Raffaele Fragapane
That Creature Dude
Animal Logic
Sydney, Australia
 
Join Date: Jul 2002
Posts: 10,950
Let's simplify things:
Zbrush doesn't use the video card much.
Adobe, when it uses GPU acceleration, does so through CUDA on Windows (on Mac the situation is somewhat reversed: it uses OpenCL, but support is poorer), which means a Kepler-generation nVIDIA card is your best bet (so the 6xx series).
Max I honestly don't know anything about, first or second hand; I just don't follow it.
At home I use a GTX 680 with Mudbox, Houdini, Softimage and Maya.

I've heard the rumors, but I found a good combo of drivers and settings early on for the 680, and can't say performance in Maya or Soft was anywhere near bad; it's definitely leaps and bounds better than the Quadro 4000 I used at work for a while.

I have no doubt the people saying they had a bad experience with the 6xx generation in Maya are telling the truth; I just didn't have the issue. I'm not a benchmark freak, but the obvious usability issues usually associated with a so-called gaming card (latency in the viewport, bad sorting, etc.) were DEFINITELY absent for me.

Contrary to some others, I had a really horrible experience with the 5xx card I tested, and was happy before that with a 260 and a 480.

I've had mixed luck with ATI/AMD, and still wouldn't really recommend the gaming cards for DCC work at home in such a (for now) nVIDIA dominated field.

For games, AMD is generally, and rightfully, considered better bang for the buck in the mid and mid-high range.

Weigh all of the above, and see how important each point is for you.
__________________
"As an online CG discussion grows longer, the probability of the topic being shifted to subsidies approaches 1"

Free Maya Nodes
 
Old 02-12-2013, 04:12 AM   #3
tswalk
Lord of the posts
 
tswalk's Avatar
portfolio
Troy Walker
USA
 
Join Date: Jan 2012
Posts: 708
There are a few "glitches" when working with GTX cards in Maya... mostly due to the fact that they support only a single overlay, whereas the Quadros support something like 8 (can't find the doc offhand)... no clue about AMD overlay support, but I'm sure it's a similar situation between the consumer and pro-grade cards.

What does this mean?

Well... most often it doesn't mean much (especially when playing games). But for DCC apps, certain operations take the hit: viewport drawing (creates an overlay), multiple windows (an overlay for each), lists and panels (again, an overlay for each). And, more often than many realize, the system and program can also become unstable or sluggish.

I think this is where people tend to start calling GTX cards crap... they feel that because the GPU is the same as in the professional cards, it should be capable of the same operations. They simply haven't read the white papers to understand the real differences.

However, I've seen and read cases where multiple GTX cards are used for IPR (or with iray, V-Ray, etc.) and they are impressive. So I see no reason for the bias.

Hell, I use a GTX 560... it's not awesome, but it does the job and I understand the limits.
__________________
-- LinkedIn Profile --
-- Blog --
-- Portfolio --
 
Old 02-12-2013, 05:37 AM   #4
FuriousPanda
Thanks guys. It's nice to get a bit of a positive review of the gaming cards and to hear how exactly overlays work. I hadn't really considered option windows and lists as an overlay issue that would affect performance.

I had read that Zbrush doesn't use the video card much, and that's the big reason I'd like to put as much money as possible into the CPU and RAM. I plan on using the system mainly for Zbrush and Photoshop/Painter illustration, Maya and Max a little, mostly for modeling and V-Ray rendering. Then again, I shouldn't downplay how much I use Maya and Max, because I'm sure I'll be using them quite a bit more than I think. I'm trying to transition more of my 3D workflow into Zbrush and have just been lacking the capacity in my personal system.

I've also read some downplaying of SLI setups as not helping much outside of gaming, and as having some other issues besides; however, if I can boost my V-Ray renders that way, I'll do it. I have an SLI setup in my current outdated system and never really had any problems.
 
Old 02-12-2013, 08:19 AM   #5
ThE_JacO
You are confusing quite a few things, first and foremost overlays and VBOs. On top of that, you are going by something that's dated, and has been for a while.

Overlays are an antiquated technique to push multiple draws straight through the hardware. Not only is it an obsolete technique at the OS level, it's also disappearing from software.

Save for some very dated features, the use of overlays has been on its way out for a while.

To my knowledge, Maya on Windows hasn't used HW overlays for its OpenGL feeds for two or three years now.
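For what it's worth, the redraw-cost argument behind the move away from overlays can be sketched with a toy model. The numbers below are made-up illustrative units, not benchmarks of any real driver:

```python
# Toy model: per-window cached buffers plus a composite pass vs. full redraws.
# WINDOW_COST and COMPOSITE_COST are hypothetical units, not measurements.
WINDOW_COST = 100    # re-rendering one window's private buffer
COMPOSITE_COST = 1   # blending one already-rendered buffer to screen

def compositing_cost(n_windows, n_dirty):
    """Frame cost when only n_dirty windows changed: re-render those,
    then composite every window's cached buffer."""
    return n_dirty * WINDOW_COST + n_windows * COMPOSITE_COST

def full_redraw_cost(n_windows):
    """Frame cost if every window must be re-rendered each frame."""
    return n_windows * WINDOW_COST

# One dirty window out of ten: the compositor does a fraction of the work.
print(compositing_cost(10, 1))   # 110
print(full_redraw_cost(10))      # 1000
```

The same small-change-avoids-big-refresh benefit that overlays provided falls out of caching each window's buffer, which is roughly what a compositor does.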
 
Old 02-12-2013, 02:50 PM   #6
FuriousPanda
Thank you both for the info; it's very helpful getting everyone's personal experience. I did a ton of research, and once I started asking for people's opinions and experiences I felt I should have just done that first.

ThE_JacO, do you do mostly creature work? That's predominantly what I do/will be doing.
 
Old 02-12-2013, 09:38 PM   #7
ThE_JacO
Personal research time is never wasted; personally, I tend to be far better inclined to offer my opinion or advice to someone who's done some than to someone who wants me to do their homework for them.

I mostly do creature work; I juggle the anim pipe and character and rigging supervision on some shows, and fall back to trench-line work between the ones I supe. Depending on the show, and the point in the show, I could be doing anything from design work to writing software; towards the end it's mostly meetings and writing angry e-mails, though.

At home, when I bother, it's a mix of design work and dev work, or keeping up to date with software for the sake of it (i.e. Houdini, which I don't get to use at work). I seldom rig at home unless it's specifically to test something I'm writing. Some post and video editing too (Sony Vegas), but minor stuff, and some games (although if I feel like games I tend to play on the console in the living room).

Both Linux and Win7. I'd say I put my video cards through enough variety, and I do have professional-level expectations of them, but I'm not a hardware nut or obsessive about every fps and cycle as long as things feel smooth.
 
Old 02-12-2013, 09:55 PM   #8
FuriousPanda
Well, I have certainly been doing a lot, and I still can't make up my mind. But I really value your input, and I'm glad to know the GTX cards could still be an option. I'm still digging through forums; I'll probably make my decision in a few days and order/build it.

It sounds like you do an impressive amount of work; mind if I ask what shows you've worked on?
DeviantArt
 
Old 02-13-2013, 05:28 AM   #9
tswalk
Well... someone needs to inform the OpenGL committee to stop trying to add support for hardware overlays then...

http://www.opengl.org/documentation...ec3/node29.html

I'm not sure what you mean by VBOs... but the purpose of hardware-based overlays is to prevent entire buffer refreshes for small transitions/changes, hence the performance increase from using them... they're not that old and are still in use as far as I'm aware.

Unless some new kind of hardware-programming gremlin has determined a better way to buffer graphics changes... maybe holy oil, or magic water.
 
Old 02-13-2013, 01:54 PM   #10
ThE_JacO
Quote:
Originally Posted by tswalk
Well... someone needs to inform the OpenGL committee to stop trying to add support for hardware overlays then...

http://www.opengl.org/documentation...ec3/node29.html


They are available, and they are still widely used on Linux, but not so much on Windows and Mac.
They are worthy of support, but pointless if we're discussing Maya for Windows here, which I thought we were.

Quote:
I'm not sure what you mean by VBOs... but the purpose of hardware-based overlays is to prevent entire buffer refreshes for small transitions/changes, hence the performance increase from using them... they're not that old and are still in use as far as I'm aware.

They are that old, and they are disappearing, thank God.
I am familiar with what they are and how they work, and have done my share of OpenGL programming.
From Vista and OS X 10.2 onward they are replaced by full hardware compositing, a less capped and more efficient way to deal with the problem.
Incidentally, this is also why disabling Aero, which many people misguidedly assumed to be just a resource hog, was often a loss of performance.

Quote:
Unless some new kind of hardware-programming gremlin has determined a better way to buffer graphics changes... maybe holy oil, or magic water.

Yeah, the hardware-programming gremlins found a way to stop requiring the management of multiple hardware overlays a while ago.

Here, this is mostly what replaces them in the case of the multiple windows you were mentioning:
http://en.wikipedia.org/wiki/Desktop_Window_Manager
Accelerated Quartz on Mac, Compiz on Linux, and so on, if you fancy reading about other platforms.

Maya 2011 caught up to the OS, meaning that since mid-2010 (stupid AD versioning) hardware overlays haven't been the dividing factor between pro and gaming cards.
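If anyone is curious whether DWM compositing is active on a given Windows box, it can be queried directly. A minimal sketch using Python's ctypes against the documented dwmapi call `DwmIsCompositionEnabled`; on non-Windows platforms it just reports "not applicable" rather than touching the API:

```python
# Query the Windows Desktop Window Manager's composition state.
# Safe to run anywhere: off Windows it returns None instead of calling dwmapi.
import ctypes
import sys

def dwm_composition_enabled():
    """True/False on Windows (Vista+), None elsewhere."""
    if sys.platform != "win32":
        return None  # DWM only exists on Windows
    enabled = ctypes.c_int(0)
    # HRESULT DwmIsCompositionEnabled(BOOL *pfEnabled)
    ctypes.windll.dwmapi.DwmIsCompositionEnabled(ctypes.byref(enabled))
    return bool(enabled.value)

print(dwm_composition_enabled())
```

Note that from Windows 8 onward composition is always on, so this check mattered mostly in the Vista/7 era being discussed here.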
 
Old 02-13-2013, 02:56 PM   #11
tswalk
Well, DWM (Quartz, etc.) just simplified things for the gremlins, but it obfuscates the complexity of having to do your own buffer management... but what do I know, I'm not a programmer... it's been 10 years since I did any DirectX programming.
 
Old 02-13-2013, 10:02 PM   #12
imashination
Expert
 
imashination's Avatar
portfolio
Matthew ONeill
3D Fluff
United Kingdom
 
Join Date: May 2002
Posts: 8,747
Everything Jaco says. There's an extremely old document kicking around from 2004 that gets linked every now and then, extolling the virtues of the Quadro over the GeForce cards. It lists hardware overlays and line antialiasing: both techniques that were dropped long ago by every 3D app except a few creaky CAD programs.

Line AA has been superseded by FSAA and its numerous variants; it's simply quicker to AA everything than to start dicking around with individual lines trying to make them look better. The same with overlays: they used to be needed to avoid slow refreshes, but various other techniques have come along over the years, which makes them really a non-feature these days.

Actually, that document is quite funny; it's so outdated that they're telling us about the advantages of using AMD processors with Nvidia gfx cards ;-)

ftp://download.nvidia.com/ndemand/Q...ro_Benefits.pdf
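The FSAA point is easy to demonstrate: sample at a higher resolution and filter down, and edge pixels come out with fractional coverage instead of hard steps. A self-contained sketch (the 4x factor and the y = x edge are arbitrary choices of mine, just to show the idea):

```python
# Supersampling demo: 4x4 subsamples per pixel against the half-plane y < x.
# The edge pixel ends up with fractional (grey) coverage -- that's the AA.
FACTOR = 4  # subsamples per pixel per axis

def coverage(px, py):
    """Fraction of pixel (px, py)'s subsamples lying below the line y = x."""
    hits = 0
    for sy in range(FACTOR):
        for sx in range(FACTOR):
            x = px + (sx + 0.5) / FACTOR  # subsample centre, image coords
            y = py + (sy + 0.5) / FACTOR
            if y < x:
                hits += 1
    return hits / (FACTOR * FACTOR)

# One scanline: fully outside, the partially covered edge pixel, fully inside.
print([coverage(px, 1) for px in range(4)])  # [0.0, 0.375, 1.0, 1.0]
```

The same pass smooths every edge in the frame at once, which is why it displaced per-primitive line AA.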
__________________
Matthew O'Neill
www.3dfluff.com
 
Old 02-13-2013, 10:54 PM   #13
ThE_JacO
Quote:
Originally Posted by tswalk
Well, DWM (Quartz, etc.) just simplified things for the gremlins, but it obfuscates the complexity of having to do your own buffer management... but what do I know, I'm not a programmer... it's been 10 years since I did any DirectX programming.

They did simplify, but they didn't obfuscate; it's a paradigm shift from split buffer management to a unified buffer. It simply doesn't work the way HW overlays used to deal with the problem; that's the gist of it.

You don't pass windows and rulesets to the hardware to sort, arrange and render for you; the OS has taken that over, not unlike other things that used to be HW-only and are now simply more convenient (not to mention unarguably faster) to let a unified model grunt through.

The long story short is that HW overlays don't make a difference in well-written, OS-close software. The differences between Quadro and GeForce, when present, are entirely artificial and driver-dependent on the software side (as in the higher-abstraction part of the driver), not down to a few lines cut on the PCB or the driver culling part of the HW functionality.

The fact that overlays are still supported in GLUT doesn't mean much, other than that A) OpenGL is historically a backward-compatible standard, and will therefore keep legacy features alive for a very long time, and B) in some Unix environments and some narrowly dedicated applications they are still used and viable, because not every desktop out there is rendered through a unified buffer.
 
Old 02-13-2013, 10:54 PM   #14
CGTalk Moderation
Lord of the posts
CGTalk Forum Leader
 
Join Date: Sep 2003
Posts: 1,066,481
Thread automatically closed

This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.