Asus Nvidia GTX 690 graphics card advice


#1

Does anybody use the Asus Nvidia GTX 690 graphics card with Maya? If so, does it improve Maya's performance? I have this card but am not sure if it is worth using; I am not a gamer but a CGI animator, and would like to know whether or not it will benefit me.

Thanks


#2

Well, if you use 3ds Max, which is based on DirectX, you'll hardly see a difference. But if you use an OpenGL application it's another story, because in the OpenGL 3D world a GeForce is garbage in performance compared to a Quadro. So if you want really good performance, have the money, and use an OpenGL application like Maya, buy a Quadro.


#3

Hard to say if you don’t post your current graphics card specs…
From the performance side the 690 should be good.

The Quadro/GeForce discussion is another thing, but I for my part decided to never ever buy a Quadro again. I use Maya all day long with CAD data and other stuff, currently on a GeForce GTX 660 Ti with no problems at all, apart from some UI glitches I mentioned in another thread, but I think those are not related to the graphics card anyway.


#4

Well, he asked if he should buy a GeForce 690, when even a Quadro 2000 will beat the GeForce 690 in performance. If he upgrades to a GeForce 690 he won't notice a big change, but with a Quadro he will. That's the reason I said that.

It all depends on the size of the scenes he works with.


#5

The Quadro 2000 is crap; it's way overpriced for what you get, with a low amount of memory and much slower speed.

I would think Maya probably can't take advantage of a GTX 690 because it is a dual-GPU card. 3ds Max definitely can't, at least for the viewport; it would only be able to use one GPU.


#6

Thank you for this breaking CG News topic.


#7

Yes, you're right that the only problem the Quadro 2000 has is its memory, but slower in Maya???

Do you have a Quadro, or did you look at the benchmarks? The Quadro 2000 in OpenGL is far, far ahead in Maya performance. Check before you write wrong facts.

And yes, it's overpriced, you're right, but he said he won't play games, and the GeForce 690 costs more than the Quadro 2000 (you can check that). And yes, you're right, it has low memory (just 1 GB).

The only things where the 690 is faster than the Quadro are DirectX applications, games, and CUDA-based applications.

Don't believe me? Check it yourself.

AND WHY IS THIS IN THE NEWS TOPIC???


#8

You'll get a lot of "general" opinions on this topic… but in reality there is a huge dependency on the application, and Maya is one that benefits greatly from Quadro hardware and driver optimizations.


#9

Hey guys, I'm scanning the Nvidia/card threads from the last 7 or 8 months to get a sense of what card to get in my new rig.

I'm trying to decide between a GeForce GTX 660 (with 2 GB) or the low-end Quadro 2000 (I can't afford the higher-priced Quadros above that).

(I'm going to start my own thread, but you guys know a lot more about this than me, so in case you're still checking this thread or getting updates from it, I thought I'd put this question here as well.)

I used to work in 3D modeling and lighting (mostly assets for games, but some medical illustrations, 2D art, etc.). These days my processing/GPU needs are different: I'm getting back into 3D, but not in large-volume production, just for my own art and experimenting with models for 3D printing. So my poly counts/geometry could get large but probably not "huge", with no big worlds or heavy production deadlines.

I used to use an old version of 3ds Max, so I'll likely be getting into Max, modo, ZBrush, possibly C4D (I liked its 3D paint tools in the past), and Photoshop for general painting, but no serious video editing and no gaming.

I'm going to have 2 PCI-E slots on the new ASUS motherboard and might add a second card in a year or two. (I'm not going to have SLI; I had it on my old system but doubt I ever took advantage of it. It seems it's best for gaming and multiple real-time viewports, from what I understand.)

On my budget, I've got an i5 with 8 GB of system RAM on Windows 7 Pro 64-bit.

(I just noticed CGTalk lists me as a new member. I'm a noob at hardware, yet I've been a member of CGSociety for a few years; I just don't post much. I guess it tallies up posts to rank members as new, mid or senior. Oh well, I'm just a geek.)

Thanks for any advice/feedback, much appreciated.


#10

Hey guys,

I need simple, clear advice; everyone tells me different things. Basically, I want to get a second-hand card.

GTX 690 $1,100 5GB
GTX Titan $1,250 6GB
(in my currency which is Singapore dollar SGD)

I use Maya 2013 with V-Ray 2.3 and I want to use V-Ray RT for quick shader tweaking. I also need a good graphics card to handle the polycount in the viewport.

One of our other machines has a Quadro K5000. People tell me it's all about the RAM and CUDA cores, so essentially a Quadro is a waste of money and I should just get a GTX 690 or even a GTX 680.

No intention to SLI anything btw even in the near future.

So which should I buy? Some reports keep comparing the Titan to the GTX 670 and say they're on par, but that's because of shitty drivers for the Titan for now.

I’d like to just pick one and go for it. Please chime in. Thanks.


#11

Just get a GeForce x60-x80 model, gen 4, 5 or 6… they are all crippled cards and they are slower than Quadros in all aspects (OpenGL, OpenCL, CUDA, etc.) except for games, but they are decent enough. Quadros are way too expensive in my opinion, but they are faster at what they do! Way faster!

The x90-series cards are dual-GPU cards… and the second GPU is not utilized by any of the 3D packages that exist today.
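
For what it's worth, you can check what the driver actually exposes before deciding; a dual-GPU board like a 690 shows up as two devices, each with its own memory pool. Here is a minimal sketch (assuming a reasonably recent NVIDIA driver whose nvidia-smi supports the --query-gpu flag and that nvidia-smi is on the PATH) that prints what each GPU reports:

    # Minimal sketch: ask the NVIDIA driver what it reports per GPU.
    # Assumes nvidia-smi is on PATH and the driver supports --query-gpu.
    import subprocess

    out = subprocess.check_output(
        ['nvidia-smi',
         '--query-gpu=name,memory.total,memory.used',
         '--format=csv,noheader'])
    for line in out.decode().splitlines():
        # A dual-GPU card such as a 690 prints two lines here,
        # one per GPU, each with its own memory pool.
        print(line)

On a dual-GPU card you get two lines of output, which is a quick way to see that an application only gets one GPU's worth of memory per device.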

Beware that the generation 6 cards (e.g. the 680) are more crippled than the gen 5 cards (e.g. the 580) when it comes to OpenCL.

Last but not least, if you just need fast viewport response and high compatibility, the 2nd-gen cards (e.g. the 285) are wonderful and you can get them cheap. They are faster than any of the newer cards in OpenGL, especially in Maya (regular viewport, not VP 2.0, etc.).


#12

The 690, compared to the 680, is a waste of money to begin with; let's start with that.
The Titan is not a PoS card, although one could argue it's not quite worth that amount of money; it's a luxury product. That said, the Titan doesn't suffer from some of the crippling issues the 6xx had with DP (double precision), but again, that seldom matters much for what you do.

Between a 690 and a Titan: the Titan, without a shadow of a doubt. Personally, at this point, with the 7xx allegedly coming this quarter, I think you should really, REALLY consider waiting a few weeks; your Titan might drop a third of its price overnight this month or the next.


#13

Sorry, but could you please, please stop spreading these old myths?

Quadros are NOT way faster.
They really, really aren't. A Titan will blaze past any Quadro priced at two to two and a half times as much in practically any regard, and the Kepler generation of Quadros is late to the game, overpriced, and has generally been received as underwhelming.


#14

I have worked with both cards… and still do from time to time. Yes, when people compare a 580/680 with a lower-end or old Quadro, they are underwhelmed. I once did a comparison between a Quadro 5000 (work), my GTX 285 (home) and my current 570 (home). The difference was roughly this:
GTX 285: 2.1 million polys at about 15-16 fps with simple shading
GTX 580: 0.4 million polys at 2-4 fps
Quadro 5000: 40 million polys at around 80 fps
I scaled the scene to match the capability of the cards instead of just depending on fps alone. I don't remember the exact numbers, but that was the general behavior. The Quadros are faster, or rather the GeForce cards are slower, when handling double-sided lighting and pixel readback in OpenGL.

Edit: This was tested in the Maya viewport only, not VP 2.0… Oh, and no, I do not support the price of Quadros!

Here is a rather stark test where they are comparing two cards side by side: a GeForce 670 (a card on the higher-end side) and a Quadro 600 (a card at the very low end of the Quadro range), and the Quadro still eats the GeForce in that particular task. http://www.youtube.com/watch?v=kl4yNCgD3iA
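
If anyone wants to run a similar rough comparison on their own card, here is a minimal sketch that can be pasted into Maya's Python script editor. The plane resolution and frame count are arbitrary placeholders, it only exercises the regular viewport redraw, and the numbers will obviously vary with viewport mode, shading and driver, so treat it as a ballpark check rather than a proper benchmark:

    # Rough viewport redraw timer for Maya's script editor (Python).
    # The mesh density and frame count below are arbitrary; adjust to taste.
    import maya.cmds as cmds

    # Build a dense mesh (~2 million triangles at 1000x1000 subdivisions).
    plane = cmds.polyPlane(width=10, height=10,
                           subdivisionsX=1000, subdivisionsY=1000)[0]

    frames = 100
    start = cmds.timerX()                 # start a stopwatch
    for i in range(frames):
        # Nudge the object so the viewport has to redraw every iteration.
        cmds.setAttr(plane + '.rotateY', i)
        cmds.refresh(force=True)          # force an immediate redraw
    elapsed = cmds.timerX(startTime=start)
    print('%d redraws in %.2f s -> %.1f fps' % (frames, elapsed, frames / elapsed))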


#15

So have I, for years, including literally side by side with a monitor switcher, hopping between the two workstations (i.e. one with a 580 and one with a 4k).

> Yes, when people compare a 580/680 with a lower-end or old Quadro, they are underwhelmed. I once did a comparison between a Quadro 5000 (work), my GTX 285 (home) and my current 570 (home). The difference was roughly this: GTX 285: 2.1 million polys at about 15-16 fps with simple shading; GTX 580: 0.4 million polys at 2-4 fps; Quadro 5000: 40 million polys at around 80 fps. I scaled the scene to match the capability of the cards instead of just depending on fps alone. I don't remember the exact numbers, but that was the general behavior. The Quadros are faster, or rather the GeForce cards are slower, when handling double-sided lighting and pixel readback in OpenGL.

See, that’s one of the worst ways to test a videocard you could possibly think of, not to mention rather dated.
That’s just not how the videocards or their drivers work, but anyway…

> Here is a rather stark test where they are comparing two cards side by side: a GeForce 670 (a card on the higher-end side) and a Quadro 600 (a card at the very low end of the Quadro range), and the Quadro still eats the GeForce in that particular task. http://www.youtube.com/watch?v=kl4yNCgD3iA

You are aware of the fact that the 6xx is DP-crippled, and can therefore be made to perform artificially horribly in some tests, right? The 580 will absolutely BLAZE past a 690, for example, if you toss them at a Fast Fourier Transform that's using double precision.

The Titan doesn’t have the crippling, which lets us hope the 7xx, based on the same silicon, won’t either.

That’s also why many people consider the 2xx and the 5xx the best gaming card gens for 3D.

Regardless, let me restate: no, Quadros aren't faster. They are exactly the same cards as the GTX, recently on lower clocks, with their on-board ID changed by a resistor (see the resistor-hack thread I posted, where a 680 is turned into a K5000) to let drivers throttle features, and occasionally (depending on the line-up) some cores lasered out or in.

The Titan will smoke a K5000 in day-to-day use with Maya, in my experience and that of others.

The generic statement that "Quadros are faster" is so fundamentally flawed as a blanket statement that it's annoying beyond belief to see it constantly repeated by people barging in and out, when it has been disproven a ridiculous number of times at this point.


#16

This conversation is going nowhere… because it seems to me that you are basically saying the same thing as me on many points.

I know that the GPUs are identical for the GTX and Quadro lines. Only minor changes on the board determine the ID and which state the GPU will operate in, and I know that the cards are crippled (especially the 6xx series); didn't you read my posts?

The crippling is why they perform badly in Maya and many other applications; they seem to have problems with pixel readback and two-sided lighting, among other important things, and that should by all means lead to the conclusion that the GeForces operate slower in Maya, thus making the Quadros faster!

You are right that in terms of specs, the GTX cards are faster cards… and this is why Nvidia should be sued, because many people upgraded their GeForce cards to ones with higher specs across the board, only to find out that those capabilities were not actually available. And it didn't say so on the box!

I don't care how I tested the card; it wasn't an official test of anything… the test showed me that the GeForce cards are crippled to perform at only about 5-10 percent of their Quadro counterparts in professional applications.

In terms of the DIY GTX-to-Quadro mod… so far I have only heard of people losing their cards to it, and the ones who got it working for a little while never got it to perform like the Quadros (missing features).


#17

If you read further up, the toss-up was between a Titan and a 690.

The Titan doesn't suffer from the same crippling.
On top of that, the DP crippling has hardly any effect on the viewport, but that's beside the point.

The blanket statement that "Quadros are faster" is what I take issue with.
It's not true in general terms, and it's not true in absolute terms, as the top end of the GTX series (the Titan, and most likely soon the 7xx, which will probably not differ much) is ahead of the K5000 even in some of the peskier artificial tests that brought the 6xx generation to its knees.

Again, a blanket "GeForces perform at 5% of Quadros" is too generic, and more frequently untrue than true.


#18

There's that blanket statement again :rolleyes: I don't work in Maya, but I bet it's another story in Viewport 2.0, since it's DX-based. Max, whose viewport is also DX-based, is another "professional application" that doesn't benefit (in the vast majority of situations) from Quadros.


#19

Why did you pull that part out of context? Read and follow my statements! I explained that it has to do with features called by the applications, non-specific to DX or OpenGL. Professional applications such as Maya, Max, XSI, CAD packages, etc. call up functions which are not used in games, such as overlays, pixel readback, DP rendering, two-sided lighting, etc. This is where Nvidia crippled the cards.
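
For what it's worth, the pixel-readback path in particular is easy to time in isolation outside of any 3D package. Here is a minimal sketch using PyOpenGL with GLUT; the window size and sample count are arbitrary, it only exercises glReadPixels on a trivially simple framebuffer, and it says nothing about overlays, two-sided lighting or the rest of the feature list above:

    # Minimal sketch: time glReadPixels (framebuffer readback) in isolation.
    # Assumes PyOpenGL with GLUT bindings is installed; sizes are arbitrary.
    import time
    from OpenGL.GL import (glClear, glClearColor, glReadPixels,
                           GL_COLOR_BUFFER_BIT, GL_RGBA, GL_UNSIGNED_BYTE)
    from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutInitWindowSize,
                             glutCreateWindow, glutDisplayFunc, glutSwapBuffers,
                             glutMainLoop, GLUT_DOUBLE, GLUT_RGBA)

    WIDTH, HEIGHT, SAMPLES = 1024, 1024, 100

    def display():
        glClearColor(0.2, 0.2, 0.2, 1.0)
        glClear(GL_COLOR_BUFFER_BIT)
        start = time.time()
        for _ in range(SAMPLES):
            # Pull the whole framebuffer back to system memory.
            glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE)
        elapsed = time.time() - start
        print('%d readbacks of %dx%d RGBA in %.3f s'
              % (SAMPLES, WIDTH, HEIGHT, elapsed))
        glutSwapBuffers()

    glutInit()
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA)
    glutInitWindowSize(WIDTH, HEIGHT)
    glutCreateWindow(b'readback-test')
    glutDisplayFunc(display)
    glutMainLoop()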

VP 2.0 is for visualizing; it is way too flawed to work with! I do hope they find a way to make it work like the normal viewport, even make it the default!… but so far, I consider it bells and whistles!

Now show me a test/benchmark where a GTX is faster than a Quadro in Maya or Max, and point me to the source of your "the Titan is not crippled" statement.

I have been working in the industry for about 25 years, guys; I am not trying to fill your heads with BS. Please prove me wrong! Tell me how I can get my GTX 570 to pull models with over 20 million polygons at at least 30-40 fps, which I can do with a Quadro 5000.


#20

http://content.screencast.com/users/m0bus/folders/Jing/media/b2a9241b-e777-41cc-991b-6f12aaeb504d/sven_rig.png

http://screencast.com/t/mX8lLyujxCW