Quadro FX 1000 vs GeForce 6800 GT - my experience


#1

I took the plunge a few days ago and bought a GeForce 6800 GT to replace my Quadro FX 1000. I have shifted away from 3D work since I bought the Quadro, so it just wasn’t worth the performance hit in games, especially the newer shader-based ones, to keep the Quadro. Knowing there is a lack of good comparisons between the two, I decided to run some tests. Here are the numbers that matter to the 3D community:

Quadro FX 1000
3dsmax-03 Weighted Geometric Mean = 13.53
catia-01 Weighted Geometric Mean = 11.39
ensight-01 Weighted Geometric Mean = 6.889
light-07 Weighted Geometric Mean = 5.848
maya-01 Weighted Geometric Mean = 21.03
proe-03 Weighted Geometric Mean = 18.78
sw-01 Weighted Geometric Mean = 10.34
ugs-04 Weighted Geometric Mean = 13.93

GeForce 6800 GT
3dsmax-03 Weighted Geometric Mean = 15.07
catia-01 Weighted Geometric Mean = 8.658
ensight-01 Weighted Geometric Mean = 5.932
light-07 Weighted Geometric Mean = 7.230
maya-01 Weighted Geometric Mean = 17.75
proe-03 Weighted Geometric Mean = 12.42
sw-01 Weighted Geometric Mean = 10.24
ugs-04 Weighted Geometric Mean = 4.500
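
For anyone unfamiliar with the metric: each viewset score above is a weighted geometric mean of the frame rates from its sub-tests, so higher is better. Here is a minimal sketch of the idea in Python (the weights are invented purely for illustration; each SPEC viewset defines its own):

  from math import prod

  def weighted_geometric_mean(frame_rates, weights):
      # Normalise the weights, then take prod(fps_i ** w_i).
      total = sum(weights)
      return prod(fps ** (w / total) for fps, w in zip(frame_rates, weights))

  # Example: three sub-tests of one viewset, equally weighted -> about 12.9
  print(weighted_geometric_mean([12.0, 18.5, 9.7], [1, 1, 1]))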

You can read my full review here:
http://www.solarflarestudios.com/forum/viewtopic.php?p=1016

Please feel free to leave any comments about this!


#2

It’s weird how the Quadro beats the GeForce 6800 in every category except Max and Lightscape… seems to me that Max doesn’t use all the OpenGL calls that Maya and XSI use, which I think is what the MAXtreme drivers are for.

Which brings up an important point: you never mentioned what drivers both cards were using.

Consider that the Quadro FX 1000 is a bit dated now: it only uses an AGP 4x bus and was replaced about a year ago by the FX 1100 (which has higher SPEC numbers and a different GPU), whereas the 6800 uses an 8x bus. So it’s surprising that the 1000 scores over 3 points higher in Maya (21.03 vs. 17.75), which would be huge for animators and would improve even more on an AMD system.

You can get a Quadro FX 1000 for about $299 US and a 6800 goes for about $399. If I didn’t play video games, the Quadro would be the better buy.

Nice comparison. I think the numbers just confirm what NVIDIA has been saying: GeForce for games, Quadro for DCC applications.


#3

Yeah, OpenGL sucks for Max.
MAXtreme offers much, much better performance; it’s the only reason for getting a Quadro with Max.


#4

Thanks for the review.
It’s great to read a review from someone who’s actually used both series of cards.


#5

Thanks Stephen for the comparison. It actually confirms my findings going from a Quadro FX 2000 (on the 45.28 drivers) softmodded to GeForce FX 5800 mode with the 61.77 drivers. The only thing I noticed a speed increase with is the SPEC benchmark suite; Maya runs much faster with the 61.77 drivers and with no issues at all. My system is kinda slow and outdated (Athlon XP 1800+ with 1 GB on a VIA KT266A Pro2), so it could be that I’m completely CPU limited in many of the tests and applications.

I find it really odd that the FX 1000 doesn’t show a bigger advantage over the 6800. By the way, 8x over 4x doesn’t change the picture that much; many comparisons have shown that the available 2.1 GB/s of bandwidth isn’t utilized by almost any application today. And Doom 3 is nice, but I wish I had something like a nitrogen cooler in house so I could OC the f*** out of my system; 800x600 sucks BIG TIME.

regards


#6

Just a small note: the AGP bus speed won’t affect frame rates, only texture and geometry transfers.


#7

That’s only true for small geometric scenes with textures kept under a certain size for performance, like a video game.

When you have a 50,000-polygon object/character rigged and deforming, with 40-megabyte textures, then it will affect the frame rate.


#8

The bus speed determines how quickly that data can be sent to the card. Once the data is there, the bus bandwidth is hardly used at all; it will not affect the fps you can get.

Its effect on the end user in video games is longer pauses when moving from one area of a level to another, where new textures need to be loaded. Within 3D apps, a 2048 texture will give a second or so of pause as it is sent to the 3D card. That amount of time is insignificant compared to the time it takes to find and load the texture, though.

If you do wish to test this, you can download and install RivaTuner to manually select a bus speed. You will find that frame rates are not impacted at all, only initial load times, and even then the difference is marginal.

Oh, and a 50,000-poly object with a 40 MB texture is nothing; even a prehistoric GeForce 2 card can display such an object faster than most monitor refresh rates.
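
To put rough numbers on that, here is a back-of-the-envelope sketch (my own illustration, assuming the nominal AGP peak rates, which real transfers never actually reach):

  # Approximate one-time cost of uploading a texture across the AGP bus.
  AGP_PEAK_GB_S = {"4x": 1.07, "8x": 2.13}  # nominal theoretical peaks

  def upload_time_ms(size_mb, agp_mode):
      # Milliseconds to push size_mb across the bus at the nominal peak rate.
      return size_mb / 1024 / AGP_PEAK_GB_S[agp_mode] * 1000

  for mode in ("4x", "8x"):
      print(mode, round(upload_time_ms(40, mode), 1), "ms")
  # Roughly 37 ms vs 18 ms for a 40 MB texture: a one-off hit when the
  # texture is first sent to the card, not something paid every frame.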


#9

You fall into the same trap as most others who are so biased in their opinions on video cards: you think that video games and DCC applications use hardware in exactly the same way.

Well, they don’t; all the SPEC benchmark numbers prove that.

I am not going to argue with you, because it’s a waste of time.


#10

Thanks Stephen, interesting results with Maya.
I’ve been waiting a long time for someone to post a comparison.

and i’d just like to say:
Hardly a day goes by on CGTalk when there isn’t a pissing contest between people and their graphics cards :rolleyes:.

I am not going to argue with you, because it’s a waste of time.
Wise decision; life is too damn short and full of far more important things to argue about.


#11

Hah! I sold my QFX 1000 about a week ago and have had a Gainward 6800 GT Golden Sample since the 12th. I posted my results in both Maya and Cinema 4D in that other thread (which is probably somewhere around page 4 by now). Anyway, the 6800 GT beat the Quadro in Cinebench, and the Maya sphere test went pretty well too. When I still had the Quadro I started working on a high-poly face model, then the GT came by post. The GT handles that scene just as well as the Quadro did.

Also, we must note a few things:

The world is more than just the USA: in Europe a Quadro FX 1000 still costs about €800-1100, while a 6800 GT can be found for a mere €430. Then there’s also the hardware-accelerated MPEG-2 chip on the 6800 GT, so more video functionality for you right out of the box. My Gainward GT also comes with dual DVI, just like the QFX 1000, and it has S-video out on top of that. My GT doesn’t get nearly as hot as in Stephen’s review. I clocked it at 400/1100 for 3D mode and 350/1000 for 2D mode; the fans run at 60% in 2D mode and 100% in 3D mode.

Temps are about 46 degrees Celsius at idle. Add 3 when working in C4D, and 10 to 12 when playing Splinter Cell.

So, for me, the GT does everything in C4D and Maya just as well as the QFX 1000.

Finally, let’s NOT forget what the GT card does best: gaming. After all, that IS one of the major reasons for the decision to switch to the GT, right?

I never ran 3DMark03 with the QFX 1000, but I’ll take Stephen’s word that it got him 3000 points. So I tried the GT in 3DMark03 at 400/1100, which are Gainward’s GT Golden Sample stock speeds:
12165 points in 3DMark03, tadah! :buttrock:

Try getting anywhere near that with a Quadro FX 1000. Lol. Anyway, for both 3D modeling/animation and games I do not regret the switch to the 6800 GT.


#12

If you have the opportunity, try some 3DSMax SPECapc comparisons between the 6800 GT (OpenGL), Quadro (OpenGL), and Quadro (MAXtreme).

(Assuming you have access to 3DSMax, of course :slight_smile: )


#13

Ckerr812, my Quadro FX 1000 was 8x AGP. All of the ones that I have seen out there are 8x AGP.

Sorry about leaving out the drivers. I ran the Quadro FX with the 56-series drivers (I don’t remember the exact version) and the GeForce with the 61.77 drivers. I know, not exactly an even test; however, the 50-series drivers are basically the ones that are certified right now for most of the pro apps.

And you can get a Quadro FX 1000 used (sometimes new) off of eBay for around $300, but if you are not into buying from eBay they are still over $800 at most of the e-tailers that I have looked at.


#14

I’d love to, but the GeForce 6800 GT replaced my Quadro. Right now the Quadro is sitting in a box, waiting to be sold. I don’t much like switching the cards back and forth; that’s a lot of money waiting to get fried by static. I ran the Quadro tests before I took it out, knowing that a test of just the cards in an otherwise basically identical environment would be useful.


#15

Thank you stephen2002 for posting the results of the tests, and thanks also to Knotter8; I read the other thread as well.

Did anyone try the new ATI cards too?


#16

No problem. I might have the opportunity to do the same shortly, so I’ll make sure I hit them with the SPECapc tests for Maya and Max as well as the Viewperf tests.

Thanks for posting yours up anyway. The results there are interesting for some speculation all the same. :slight_smile:

[EDIT]

And for a laugh, here are some results from my aging box at home (Athlon XP 2600+ running at 11.5 x 200 = 2300 MHz) with a GeForce4 Ti 4200:


  Run All Summary 
  
  ---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
  3dsmax-03 Weighted Geometric Mean =   9.065
  
  ---------- SUM_RESULTS\CATIA\SUMMARY.TXT
  catia-01 Weighted Geometric Mean =   7.808
  
  ---------- SUM_RESULTS\ENSIGHT\SUMMARY.TXT
  ensight-01 Weighted Geometric Mean =   4.808
  
  ---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
  light-07 Weighted Geometric Mean =   9.231
  
  ---------- SUM_RESULTS\MAYA\SUMMARY.TXT
  maya-01 Weighted Geometric Mean =   19.20
  
  ---------- SUM_RESULTS\PROE\SUMMARY.TXT
  proe-03 Weighted Geometric Mean =   10.30
  
  ---------- SUM_RESULTS\SW\SUMMARY.TXT
  sw-01 Weighted Geometric Mean =   5.136
  
  ---------- SUM_RESULTS\UGS\SUMMARY.TXT
  ugs-04 Weighted Geometric Mean =   2.947
  

[EDIT2]

Woo! Look at those Lightscape and Maya scores! The old Athlon XP keeps up with the P4 there (and even beats it at times), even with the GeForce4 “handicap”. :slight_smile:


#17

Yeah, the thing to remember is that these benchmarks measure how well your whole system runs, not just the video card, which is how all benchmarks that test DCC applications should work, because they’re not as reliant on the video card as video games are.

I know Maya is perfectly happy with any video card really, but there is a speed difference.

A score of 15.78 versus 18.71 is a HUGE difference in SPEC, more than the benchmark numbers alone would suggest, really.

Once you start running all these tests, it gets addicting :smiley: You learn a lot about how certain APIs and program architectures use hardware more efficiently than others, and how some programs don’t even use the extra OpenGL calls on the Quadros. Also, you find out that Maya really loves AMD processors :slight_smile:


#18

Ah, OK. The older Quadro FX 1000s were AGP 4x when they first came out.

Truth of the matter, about the drivers: I am using the 65.64 drivers, which are supposed to be a nice OpenGL upgrade. I get a nice improvement on the Quadros with these in SPEC, and I also noticed better performance with a 6800 GT in Doom as well.

If you can download these drivers off the net (they’re floating around), it’s worth it.

Remember they’re beta drivers though, so use them at your own risk.


#19

Thanks for taking the time to run those tests, Stephen! Even if it’s only a few tests, it’s still more data than we had before :).

Now if only those 6800 Ultras would drop below $400… It’s so hard to spend $500-600 on a farking gaming card… even if it is redonkulously fast.

