Old 02-15-2017, 06:56 AM   #1
CanamAldrin
Explorer
 
Join Date: Feb 2017
Posts: 4
Quadro: What am I missing out on?

Hi Friends,

I'm about to commit to a GPU upgrade and am struggling over whether to go with a GTX 1080 or spring for a Quadro P5000 or P6000. I know there's been discussion here before on GeForce vs Quadro, but my question comes with a slight twist.

Previous comments seem to say something along the lines of "Quadro is not worth it for DCC" (Maya / ZBrush is my bag, occasionally some Max) "unless you need those special features that only it supports". But what I never hear much about are specifics of what those "special features" are. The one specific I did read in a post was that Maya's legacy viewport performance was dogged on GeForce compared to Quadro. In the past I used to hear that only Quadro would do anti-aliased lines, or that GeForce didn't accelerate windowed viewports, only fullscreen (i.e. games). But I've recently read somewhere that the Pascal GeForce cards/drivers have lifted some of the artificial caps Nvidia had placed on OpenGL performance, so maybe this stuff is no longer true.

The other concern I have is about GPU-based rendering. I've done almost zero of it, but want to start dabbling to see if it can be helpful to me. So that is another consideration in my choice, although I will build a GPU render system for it if I get really serious about it.

TL;DR - Can someone demystify what the "special features" are that Quadro offers, primarily for Maya, for GPU rendering, and I suppose for ZBrush too (though I doubt they expect ZBrush artists to run Quadros)?

THANK YOU!!!
 
Old 02-16-2017, 05:33 PM   #2
cgbeige
Expert
 
Dave Girard
Opinions are mine. You can't have them.
Oakland, USA
 
Join Date: Jul 2005
Posts: 7,020
ZBrush doesn't use the GPU at all, so you don't want to bother for that. I'll leave the other stuff to others, but this discussion has been had a ton here, so do a search.
 
Old 02-16-2017, 05:54 PM   #3
CanamAldrin
Explorer
 
Join Date: Feb 2017
Posts: 4
Thanks for the info about ZBrush. I did some thorough searching of the forums here before posting, and I don't think I'm rehashing old ground; I'm asking something a little more specific. The posts I found while searching say Quadro's not worth it in DCC apps "unless you need the special features" it has, but they give hardly any details on what those special features actually are. So that is what I'm hoping to get some details on.

Also, like I mentioned, I've heard the new Pascal GeForce cards have better OpenGL drivers, so perhaps there's new info on this front too?
 
Old 02-16-2017, 07:09 PM   #4
olson
Houdini|Python|Linux
Luke Olson
Dallas, USA
 
Join Date: Jan 2007
Posts: 2,974
You can get a lot more memory on the Quadro cards than you can on the GeForce cards. This comes into play mostly with GPU rendering, where the amount of GPU memory can literally be a show stopper. For example, 24 GB on the Quadro P6000 versus 8 GB on the GeForce GTX 1080.
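
If you want to sanity-check how much VRAM is actually free before committing a scene to a GPU renderer, something like this works as a rough, untested sketch (the nvidia-smi query fields are standard; the function name and the 9000 MiB threshold are just my own example):

Code:
# Read free VRAM from nvidia-smi (ships with the Nvidia driver)
# before kicking off a GPU render.
import subprocess

def free_vram_mib(gpu_index=0):
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=memory.free",
         "--format=csv,noheader,nounits"],
        text=True)
    return int(out.strip())  # reported in MiB

free = free_vram_mib()
print(f"Free VRAM: {free} MiB")
if free < 9000:  # arbitrary example threshold
    print("Scene may not fit on the card; consider out-of-core rendering or more VRAM.")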

There was a time when some applications required workstation cards for everything to work. Maya was well known for being picky about the GPU ten years ago. Nowadays most applications have redone their viewport code so they don't rely on workstation-GPU-specific features.

Along with this shift in viewport technology has come a shift in Nvidia's product offerings. The workstation GPUs used to have the same memory and everything as the gaming cards (just different firmware and drivers). They seem to offer a lot more for the same price compared to ten years ago, but they're still really expensive if you don't need the features they provide (like more memory).
__________________
http://www.whenpicsfly.com
 
Old 02-16-2017, 07:24 PM   #5
Srek
Some guy
 
Björn Dirk Marl
Technical Design
Maxon Computer GmbH
Friedrichsdorf, Germany
 
Join Date: Sep 2002
Posts: 11,465
Many large CAD applications only support Quadros as certified hardware. In the past, Nvidia only allowed software vendors outside of games to certify their software on Quadros, regardless of how well it worked with GeForce cards.
This is one of the key elements of the "you have to get a Quadro for pro use" myth.
Then there are some high-end features, like quad buffering, that can be important for non-gaming stereoscopic solutions and that Nvidia does not support on GeForce cards.
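
For reference, you can roughly probe whether a given driver exposes quad-buffered stereo with something like the following (untested sketch using the glfw Python bindings; GeForce drivers will typically refuse the stereo framebuffer request):

Code:
# Ask for a quad-buffered (stereo) OpenGL framebuffer and see whether
# the driver grants it. Requires the glfw bindings: pip install glfw
import glfw

glfw.init()
glfw.window_hint(glfw.STEREO, glfw.TRUE)    # request left/right front+back buffers
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no need to actually show a window
try:
    window = glfw.create_window(64, 64, "stereo probe", None, None)
except glfw.GLFWError:
    window = None
print("quad-buffered stereo available" if window else "no quad-buffered stereo")
glfw.terminate()
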
There are claimed advantages regarding speed and reliability when it comes to software like Max or Maya, but to the best of my knowledge they are hearsay.
There is one real difference, and that is the priority with which Nvidia fixes driver bugs. Quadro and GeForce drivers are maintained by different teams; the GeForce team focuses on game support while the Quadro team focuses on 3D apps (CAD, DCC and others). It can often take longer to get a DCC-application-related driver bug fixed for GeForce than for Quadro.
I have used both Quadros and GeForce cards over the last 16 years, and the only thing I really had problems with was Nvidia's bug-fixing priority.
If I had to pay for it myself I would never buy a Quadro; the advantages are too little for the price premium.
__________________
- www.bonkers.de -
The views expressed on this post are my personal opinions and do not represent the views of my employer.
 
Old 02-16-2017, 10:44 PM   #6
CanamAldrin
Explorer
 
Join Date: Feb 2017
Posts: 4
Thanks for the replies. That's what I was looking for, some details on what exactly you get with Quadro. It helps to hear how things have changed in recent years. I think I was holding on to some perceptions of yesteryear.

As for myself, I placed an order for a Titan X (Pascal). I figure the 12 GB of VRAM should be enough for me to experiment with GPU rendering for now. It's an expensive card, but cheap compared to what I got used to spending on Quadro 5xxx and 6xxx cards in the past.
 
Old 02-19-2017, 08:00 PM   #7
SD3D
Expert
 
Join Date: Oct 2015
Posts: 254
10-bit colour per channel for OpenGL applications, also referred to as 30-bit colour. You'll need a 10-bit monitor to benefit from this, and unless you are a photographer who prints, it's kind of unnecessary.
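
If you want to check what the driver actually hands you, a rough, untested probe with the glfw/PyOpenGL Python bindings looks something like this:

Code:
# Request a 10-bit-per-channel OpenGL framebuffer and report what was
# actually granted. Requires: pip install glfw PyOpenGL
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

glfw.init()
for hint in (glfw.RED_BITS, glfw.GREEN_BITS, glfw.BLUE_BITS):
    glfw.window_hint(hint, 10)               # ask for 10 bits per channel
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # hidden probe window
window = glfw.create_window(64, 64, "10-bit probe", None, None)
if window:
    glfw.make_context_current(window)
    print("R/G/B bits granted:",
          [glGetIntegerv(q) for q in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS)])
glfw.terminate()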

Apparently they also offer "double precision" (FP64), which means they can represent numbers more accurately.
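
To make the precision part concrete, here's a small CPU-side numpy illustration (the GPU-side story is mainly FP64 throughput, which varies by chip rather than strictly by Quadro vs GeForce):

Code:
# float32 runs out of integer precision at 2**24; float64 does not.
import numpy as np

x32 = np.float32(2**24)
x64 = np.float64(2**24)
print(x32 + np.float32(1) == x32)   # True:  the +1 is lost in float32
print(x64 + np.float64(1) == x64)   # False: float64 still resolves it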

If you use Direct3D for your viewport, Quadros offer no benefit.

I think Cinema 4D is an odd one in that it might benefit from a Quadro in some tasks, but a GeForce in others. Not sure about that one.
 
Old 02-19-2017, 09:25 PM   #8
CanamAldrin
Explorer
 
Join Date: Feb 2017
Posts: 4
You get 10-bit with GeForce now too. I'm running 10-bit from my 1080 to an LG monitor as I type this...
 
Old 02-20-2017, 12:36 AM   #9
SD3D
Expert
 
Join Date: Oct 2015
Posts: 254
Quote:
Originally Posted by CanamAldrin
You get 10-bit with GeForce now too. I'm running 10-bit from my 1080 to an LG monitor as I type this...


Not for OpenGL; I think it's fullscreen DirectX only. Computer games, but not Photoshop.
 
Old 02-20-2017, 05:10 AM   #10
Srek
Some guy
 
Björn Dirk Marl
Technical Design
Maxon Computer GmbH
Friedrichsdorf, Germany
 
Join Date: Sep 2002
Posts: 11,465
Quote:
Originally Posted by SD3D
I think Cinema 4D is an odd one in that it might benefit from a Quadro in some tasks, but a GeForce in others. Not sure about that one.

Quad buffering for the stereoscopic viewport is the only thing I am aware of.
__________________
- www.bonkers.de -
The views expressed on this post are my personal opinions and do not represent the views of my employer.
 
Old 02-20-2017, 08:31 AM   #11
SD3D
Expert
 
Join Date: Oct 2015
Posts: 254
Quote:
Originally Posted by Srek
Quad buffering for the stereoscopic viewport is the only thing I am aware of.


Here is what Nvidia says about it
 
Old 02-20-2017, 08:51 AM   #12
Srek
Some guy
 
Björn Dirk Marl
Technical Design
Maxon Computer GmbH
Friedrichsdorf, Germany
 
Join Date: Sep 2002
Posts: 11,465
Quote:
Originally Posted by SD3D

Cinema 4D does not use 10-bit output.
__________________
- www.bonkers.de -
The views expressed on this post are my personal opinions and do not represent the views of my employer.
 