
View Full Version : 6800GT slower than software acceleration in max?


PFS
10-17-2004, 02:47 AM
Pardon the noob question...I've done loads of searching and can't find a straight up answer.

I just built my first computer for the purposes of doing some decent 3D work. I don't have the cash now for a workstation card, so I thought a decent gamer card would do for now. I got a BFG 6800GT.

I just gave it a test in Doom 3...all good there. :) However, I've just tried it in max, and it goes slower than if I'm just using software acceleration.

Now, I've done my reading and I know not to expect workstation performance from a gamer card (even though some do ;) ), but I've always figured that if there even IS a debate going on, a 6800GT would have SOMETHING going for it in viewport performance.

Does anyone have any advice here? Are there drivers I should check out? Options I should change? Or is gamer card performance just that bad? I have a really hard time believing it is....

If anyone can offer any help, I'd be deeply appreciative. Cheers :beer:

DanSilverman
10-17-2004, 03:32 AM
To be honest, I have never used a "workstation" video card and I have always used a "gamer" card. I have a GeForce 6800 Ultra installed on my workstation and the speed at which it renders in MAX is incredible, so I would think the problem is not the card, per se.

The first thing I would do is download and install the latest drivers from nVidia. My card is from ASUS and the drivers that came on the CD caused me all sorts of problems. Every time I ran MAX and then closed the application, my screen would go nuts and I would have to reboot. I dumped those drivers and went with the ones from nVidia and my problems were solved.

There can be other issues as well. What sort of a PC do you have (CPU, how much memory, motherboard, etc)? Are you running your screen at 32-bit colour depth? What sort of processes are running in the background? MAX can be a very intensive application.

Knotter8
10-17-2004, 04:39 PM
Yeah, no problems here with my 6800GT either. It beats a QuadroFX1000 at Cinema4D and it's about on par with it in Maya 5.

I suggest you wipe all old Forceware registry files with Detonator RiP or DriverCleaner 3.2, then clean-install the latest Forceware. Also, don't forget to install DirectX 9.0c if you haven't already.

evo_supra
10-17-2004, 04:44 PM
Anyone know how the 6800GT will cope with 3dsmax software?

heavyness
10-17-2004, 06:51 PM
I'm guessing you tried both OpenGL and DirectX within max? Like said above, reinstall your video drivers. Look for a program called 'Nasty File Remover'; this makes sure there aren't any leftover driver files that might screw up the new installation.

Lord Banshee
10-17-2004, 08:09 PM
I have found that it is impossible to run max in OpenGL with an Nvidia game card. It seems that Nvidia limits its power A LOT in 3dsmax, I guess to make the Quadros look better than they are. But switch it to DirectX 9 and you will be amazed at the speed.

I heard XSI, Maya, and Cinema 4D have no problem with this card, and many other Nvidia cards, with OpenGL acceleration. Am I the only one that thinks this is an unfair way to promote their workstation cards??

PFS
10-18-2004, 12:04 AM
Thanks, everyone, for the replies.

Dan: I'm running dual opteron 242s on an MSI K8T Master2 board, with a gig of ram and 32-bit colour depth.

Knotter8 & kole: I quickly tried updating my driver to the latest nVidia one, to no avail. I'm in the middle of midterms right now, so I don't have time to do an intense driver cleanup, but I will follow that advice as soon as I have a bit more time. Cheers. :) Hopefully that will help out.

Lord Banshee: I switched to DirectX, and WOW, are you ever right. Suddenly I was pushing a million polys with zero lag. However, DirectX 9 isn't sitting right with my machine for some reason. I had to revert to 8.1 in max, and in XSI the DirectX 9 window simply didn't draw anything. But I'm sure I can sort that out. It would be too bad if switching to DirectX is the permanent fix for this problem; I went with nVidia because I heard they did better OpenGL than ATI.

Is it true that they slow down OpenGL on their gamer cards? I think that would definitely be an unfair business practice. What evidence is there to back that up? I'm surprised the graphics community would let them get away with it. And is that how one makes a softquadro, by somehow removing the OpenGL limit? I've heard about softquadding a lot in my reading.

Hugh-Jass
10-18-2004, 12:22 AM
Have never had any problems with Open GL in 3dsmax, I prefer it over DirectX...

I've used gamer and workstation cards...there is also a "maxtreme driver" on Nvidia's site.

You should definitely uninstall drivers AND switch to software mode in 3dsmax,
save a file, close max.
Then install your drivers...
Switch to OpenGL in max, save a file, quit, and restart max.
I've had it not take the changes unless you force it out of one video driver and back.

Lord Banshee
10-18-2004, 12:29 AM
Have never had any problems with Open GL in 3dsmax, I prefer it over DirectX...

I've used gamer and workstation cards...there is also a "maxtreme driver" on Nvidia's site.

You should definitely uninstall drivers AND switch to software mode in 3dsmax,
save a file, close max.
Then install your drivers...
Switch to OpenGL in max, save a file, quit, and restart max.
I've had it not take the changes unless you force it out of one video driver and back.
I used to be the same way about OpenGL. Then when I upgraded from my GeForce 3 to a 5900XT I had even better OpenGL performance, but when I had a model at sub-d iteration 2 it would get laggy. In DirectX 9 it is 10x faster, and I am not lying one bit. I think it is stupid, as the 5900XT is supposed to have CRAPPY DirectX 9 performance and GREAT OpenGL performance.... What the hell happened.... I have nothing to back it up, but how in the hell can you explain some of the SPECviewperf benchmarks? 5900XT vs 6800GT is barely a difference.... Umm, there should be A HUGE difference.

Hugh-Jass
10-18-2004, 12:37 AM
I should clarify my previous statement...I prefer OpenGL in versions of max earlier than max6.
Many of the pipelines I have to use rely on older versions of max, and DirectX has some issues displaying verts in Physique and doesn't cull wireframes...

DanSilverman
10-18-2004, 05:24 AM
At one time or another I have owned most types of nVidia based cards (TNT2, GeForce2, 3, 5600 and now the 6800 Ultra). In all cases I have run MAX in OpenGL and have had no problems with it at all.

PFS
10-18-2004, 07:29 AM
This is all quite odd. Dan, if you've got like three minutes sometime, can I send you a file and you tell me what kind of frame rates you get when you manipulate the viewport? It would be interesting to get a rough benchmark from a system that's working properly. Drop me a line at paddy at shadnet dot shad dot ca if that would be okay. Thanks! :)

DanSilverman
10-18-2004, 03:25 PM
I would have no problem doing that, but I am visiting the USA for two more weeks and thus I am on my laptop. My GeForce 6800 Ultra is in my workstation at home. So, if you can wait a couple of weeks, I will be more than happy to.

PFS
10-18-2004, 05:42 PM
It'll probably be that long until I get over this influx of school work and have time to really look at the problem, so sounds great. Many cheers Dan. :)

Lord Banshee
10-19-2004, 06:27 AM
Just read this benchmark of the nForce4. Well, to my surprise, they ran SPECviewperf 7.1 with the 6800 Ultra..

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2248&p=12

The 6800 Ultra on a machine with a lot more CPU power (Athlon FX-55)

seems to show not much gain in 3dsmax... or UGS... basically not in any of them.

I am going to run my SPECviewperf (version 7.1) again tonight on my 5900XT, and you can decide what to make of it.

But my last scores were the following on my 2800 at stock speeds.

Run All Summary
---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-02 Weighted Geometric Mean = 14.15
---------- SUM_RESULTS\DRV\SUMMARY.TXT
drv-09 Weighted Geometric Mean = 75.71
---------- SUM_RESULTS\DX\SUMMARY.TXT
dx-08 Weighted Geometric Mean = 74.04
---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-06 Weighted Geometric Mean = 14.44
---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-02 Weighted Geometric Mean = 18.66
---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-03 Weighted Geometric Mean = 7.761

I'll post again tomorrow with my overclocked 2800 (500+ MHz).


Here is another site that ran a 6800GT. It got better scores than the one above (I think the one above was running 7.1 and not 7.1.1).
http://www.hardtecs4u.de/?id=1091231009,21620,ht4u.php

My computer crashed last night; not sure what happened, but the OS is done for. So I have to reformat, and I will get those benchmarks as soon as I can. In the meantime, if someone here has a 6800GT and SPECviewperf 7.1.1, please run it and post your results. I am truly thinking about getting an ATI card next time around because of these weird results Nvidia is getting.

raz-0
10-19-2004, 07:46 PM
One thing to remember, people: from the 5800/5900 series of cards to the 6800 series of cards, most of the improvement will not affect basic DX7/8 performance or basic OGL performance.

The vast improvement was in pixel shaders and vertex shaders, as well as advanced stencil shadow functions and junk like that. Most of which is meaningless in most 3D apps at the moment (unless supporting special plug-ins for doing game asset development).

I had a 5800 Ultra and a 6800GT. The GT is on par with the 5800 Ultra in SPECviewperf stuff and max OGL tests on the same machine. Both ARE definitely faster than software mode. DirectX mode, however, is blazingly fast except for the initial lag when manipulating stuff in the viewport.

Lord Banshee
10-20-2004, 02:59 AM
One thing to remember, people: from the 5800/5900 series of cards to the 6800 series of cards, most of the improvement will not affect basic DX7/8 performance or basic OGL performance.

The vast improvement was in pixel shaders and vertex shaders, as well as advanced stencil shadow functions and junk like that. Most of which is meaningless in most 3D apps at the moment (unless supporting special plug-ins for doing game asset development).

I had a 5800 Ultra and a 6800GT. The GT is on par with the 5800 Ultra in SPECviewperf stuff and max OGL tests on the same machine. Both ARE definitely faster than software mode. DirectX mode, however, is blazingly fast except for the initial lag when manipulating stuff in the viewport.


Hmmm, good points. But Nvidia and ATI do advertise more polys/sec, yet there's not much of a difference in line with what they advertise.

Another thing that bothers me is that ATI cards do a lot better in the UGS benchmark, which I don't understand when that is purely an OpenGL benchmark. Maybe you can help with that.

I am not saying I don't like my 5900XT, but I just think it is weird..... And I would like to know exactly what's disabled on the gamer cards to make the Quadros so fast in certain programs.

Here is my SPECviewperf 7.1.1 score with my overclocked AMD64 2800+ @ 2.3GHz:
Run All Summary

---------- SUM_RESULTS\3DSMAX\SUMMARY.TXT
3dsmax-02 Weighted Geometric Mean = 16.47

---------- SUM_RESULTS\DRV\SUMMARY.TXT
drv-09 Weighted Geometric Mean = 84.80

---------- SUM_RESULTS\DX\SUMMARY.TXT
dx-08 Weighted Geometric Mean = 88.23

---------- SUM_RESULTS\LIGHT\SUMMARY.TXT
light-06 Weighted Geometric Mean = 17.89

---------- SUM_RESULTS\PROE\SUMMARY.TXT
proe-02 Weighted Geometric Mean = 21.01

---------- SUM_RESULTS\UGS\SUMMARY.TXT
ugs-03 Weighted Geometric Mean = 7.919

Oh, both results were done in Win2k Pro with the 61.77 drivers.
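For what it's worth, the two runs above can be compared directly. Here's a quick Python sketch (not from the thread; the dictionary layout is my own, with the numbers copied from the two posts) that prints the percentage gain per viewset from the CPU overclock:

```python
# Compare the two SPECviewperf 7.1.1 runs posted above:
# stock AMD64 2800+ vs. the same chip overclocked to 2.3 GHz.
# Illustrative sketch only; scores are copied from the posts.
stock = {"3dsmax-02": 14.15, "drv-09": 75.71, "dx-08": 74.04,
         "light-06": 14.44, "proe-02": 18.66, "ugs-03": 7.761}
overclocked = {"3dsmax-02": 16.47, "drv-09": 84.80, "dx-08": 88.23,
               "light-06": 17.89, "proe-02": 21.01, "ugs-03": 7.919}

for viewset, before in stock.items():
    after = overclocked[viewset]
    gain = (after - before) / before * 100  # percentage improvement
    print(f"{viewset}: {before:.2f} -> {after:.2f} ({gain:+.1f}%)")
```

Every viewset scales noticeably with the overclock (roughly +12% to +24%) except ugs-03, which gains only about +2%, consistent with the observation later in the thread that some of these tests are bound by something other than the CPU.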

raz-0
10-20-2004, 08:42 PM
What's disabled are the special application-level drivers, MAXtreme for 3dsMAX for example. Also, there are some things locked in the driver or on chip, like AA wireframes and such, when you are talking GeForce vs. Quadro.

To know why UGS tests would run faster on ATI, I'd have to know something about how Unigraphics does its thing. Which I don't. I pretty much use MAX and not much else.

As for the polygons/sec, that is a matter of fillrate for the card. Gaming cards have always had good fillrate, but it isn't necessarily what a 3D app taxes.

Your SPECviewperf results look about right for your CPU and video card. Other than that, they don't mean much. What means more is actual performance in the app of your choosing in the scenes you need to work with. At that point, you need to know what the bottleneck is for whatever you are having problems with. I've seen a number of postings on various forums where someone plonked down the cash for a Quadro 3000 expecting their super dense scenes to finally fly. Had they asked around, they would have found they were pushing the bounds of the software or CPU rather than the video card. The video card won't help do the math for estimating the dynamics that was slowing things down, and it won't make the software any better at handling scenes with nearly a gigabyte of model data in them.

CGTalk Moderation
01-19-2006, 02:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.