View Full Version : GF FX Quadro vs Wildcat 4


WhiteRabbitObj
01-22-2003, 05:42 AM
Can anyone give me a good comparison between the Wildcat 4 and the GeForce FX Quadro cards?

I am going to be purchasing some hardware for a workstation soon and though I own stock in nVidia, so I would normally want to buy their products, I want to get the absolute fastest card. I know the Wildcat is much more expensive, but I don't have much idea how it compares in real word tests and how much it is or is not limited by CPU speeds (as in, it has to wait for the CPU to keep up sometimes).

Any thoughts on which card is better and why are welcome!

Cararan
01-22-2003, 08:00 AM
Hold out for the Quadro FX 2000; it seems like this is going to be a very promising card.

aYs
01-22-2003, 11:59 AM
there is a test comparing wildcats, quadros and firegls
maybe i'll find it

anyway, the wildcats are pure CAD cards with bad 2D performance, and not that good in OpenGL 3D apps like LW, Maya...
(it's more a product for engineers and not for vfx artists)

the quadro is the best allround card you can get with great 3D and 2D performance

well, just my 2 cents

bye

David Watters
01-22-2003, 03:21 PM

GregHess
01-22-2003, 04:02 PM
Originally posted by David Watters
Does anyone have anything I am overlooking where Wildcat 4 still provides a benefit?

Yes, but it's at home. I'll post later today. I have a few clients who are still going with Wildcat for various reasons.

Nice 3dsmax scores on the FX... I don't know if the additional 700 USD warrants the 10% performance difference. I guess we'll have to wait to see real-world scores.

David Watters
01-22-2003, 04:06 PM
Originally posted by GregHess
Yes, but it's at home. I'll post later today. I have a few clients who are still going with Wildcat for various reasons.


Please do.

Thanks,
David

David Watters
01-22-2003, 04:21 PM

dmeyer
01-22-2003, 06:33 PM
Just to play devil's advocate a bit...

I've a Quadro750 at home and a Wildcat6110 at work (not the same cards as you are asking about, but just for the sake of argument)... and I've noticed that the Wildcat taxes the CPU a lot less. And the speed in general seems about the same, unless I am working on a ridiculously large model, in which case the 6110 feels faster. These are purely subjective views on feel (and the process manager), but just a point to make.

For my money though I'd still be looking at Quadro, unless you run SolidWorks or even StudioTools all day long.

StefanDidak
01-22-2003, 07:59 PM
Originally posted by David Watters
Does anyone have anything I am overlooking where Wildcat 4 still provides a benefit?

If you really want to know.......

We have been sticking with 5110/6110/etc. for a variety of reasons. Excellent always-accurate polygon clipping, high accuracy at an acceptable speed trade-off, superb anti-aliasing, stable drivers and great dev-support. An overall "cleaner" looking result in realtime images, especially when realtime encoded from an OpenGL window to interlaced CCIR standardized broadcast output. Oh, and the fact that through the IO and hardware architecture we can pump two streams of live HD as textures which are then used in conjunction with live broadcast sources.

Those are just a few things off the top of my mind; for further details I'd have to check with my engineers, who are more knowledgeable about it at the bit-and-byte level.

That's not to say nVidia's products are inferior or bad, but there are definite differences as much as there are similarities, where the WC range of products will do that extra bit for high-end mission-critical realtime systems where you absolutely have to depend on stability and image quality and where 'cool realtime shader tricks' just don't matter all that much. It's a different, and perhaps niche, market than what nVidia has been aiming for, even with the Quadro line of products, which are more aimed at DCC and workstation systems. Therefore you can try to compare the boards, their features, etc., but unless you put it in the context of the different markets it means very little.

I hope that satisfies your question a little. :)

StefanDidak
01-22-2003, 08:04 PM
Originally posted by dmeyer
Just to play devil's advocate a bit...

I've a Quadro750 at home and a Wildcat6110 at work (not the same card as you are asking but just for the sake of argument)....and I've noticed that the Wildcat taxes the CPU a lot less.

That is indeed another thing that's very noticeable. The CPU load with the 51xx/61xx series is a lot less than with pretty much any other similarly performing (or claiming to perform) card. We've used that to our advantage in realtime 3D systems for a while now, where dual Xeon boxes are pushed to the limit at the CPU level while still remaining 100% in sync with the display output and texture feeds. We've done an awful lot of custom synchronization in the code and applications to take advantage of it, because timing has been essential, and so far no other card has come close to delivering the same performance under the same conditions.

GregHess
01-22-2003, 09:36 PM
Dave,

Here we go. In order of importance...

1) When using 3dsmax without maxtreme, there is substantially less performance gain between a Quadro and its consumer equivalent, the GeForce line.

This issue doesn't seem very big at first, as why would anyone not use maxtreme? Well, that's pretty easy... it didn't work. For a good portion of time after the release of max5, there were no versions of maxtreme completely compatible with max5. Yes, there were some that "worked", but working in a consumer sense and in a workstation sense are completely different categories. [Working in a workstation sense means zero stability issues, zero flaws, zero bugs, and zero problems. This was not available for many months after the max5 release.]

Because of this software delay (one can blame it on a # of sources, but regardless, it was delayed), many max users are now wary of investing in such an expensive solution that won't really benefit their app of choice. (They're afraid it'll just happen again.) This may not result in them going to a Wildcat or ATI solution... but it makes them far more likely to go to the much cheaper consumer line.

2) High poly scenes. On some scene sets (usually those of massive size... 2 million+ polys, a few hundred megs) the wildcats tend to perform better. When looking at something like a 980XGL (I haven't seen the FX), the Wildcat's performance curve drops off much more slowly. Though the nvidia solution might be superior in performance, this edge tends to drop off when specific scenes get overly large.

3) Antialiasing. Though once again a small percentage, visualization teams sometimes go for the clean, sharp look of the WCs when displaying a radiosity solution to a client.

That's a few of them.

dvornik
01-24-2003, 05:31 AM
Thanks Greg for mentioning the Maxtreme issue. We would never put consumer-level cards into our main workstations, and we are looking at other pro cards for our next purchase. We are a DCC shop and 3ds max performance is a rather major factor for us.

I have talked to Boxx about the issue and they sent me to PNY. I filed a formal bug report at PNY, and the guy was not very optimistic about when, and if, it's going to get fixed. He advised not to use it.

GregHess
01-24-2003, 12:07 PM
dvornik,

Did you talk to Boxx about possibly replacing the Quadros? They might do a swap for an equivalently priced, or slightly more expensive, card in the workstations.

I hope nvidia gets their act together on making a 100% working maxtreme. It might be Discreet's fault due to changes in the max viewports, but regardless, it needs to be fixed if these cards are going to be sold to max houses.

David Watters
01-24-2003, 08:27 PM
dvornik,

I have escalated the MAXtreme questions and issues. I should have some sort of feedback for you shortly.

elvis
01-25-2003, 07:16 AM
to add to the maxtreme discussion:

up until friday last week, my 3dsmax guys were working flawlessly with 3dsmax 5.1 and the latest maxtreme drivers on two dell systems with a quadro4 900XGL and a quadro2 mxr.

friday afternoon it all went bad. two files in particular would crash at random intervals when objects were selected (no single element in particular, just pick something and boom, it's crashed).

this went on for a few hours until i swapped them all back over to openGL, at which time the problems disappeared.

until now i've been quite pleased with the MAXtreme stability, but it seems this is no more.

i hope NVIDIA put the MAXtreme drivers on their list of priorities just as they would their detonators. these drivers are the biggest selling point of NVIDIA's quadro cards to MAX users.

Flyby
01-25-2003, 08:16 AM
It is obvious that if you want to market a professional GFX card at a higher price tag, you have to be able to rely on ROCK SOLID drivers.
Most 3dsmax professionals here will not accept paying for something that doesn't live up to their expectations in terms of performance and stability.
Nvidia is apparently only capable of delivering the first part of this equation, as they are able to manufacture high-performance GPUs. However, as long as they don't pay enough attention to the second part, the Quadro line will continue to have a bad aura, as its performance rests partially on the (buggy) maxtreme drivers...

I personally got burnt on the maxtreme drivers way back when ELSA was still in charge. I was almost tempted to jump on the new quadro line when Nvidia took over after ELSA went belly up, but alas, it turned out that Nvidia's support for the maxtreme driver was on par with ELSA's. No good, thus... So I went for the "consumer" range of cards instead, using OGL...

So, David, if Nvidia wants to sell their Quadro line, you guys need to put more effort into the maxtreme drivers and rebuild their reputation by making them more stable than is currently the case. Fixing drivers 6 or 9 months after a board hits the street is NOT an option. It only generates more frustration and another unhappy customer who will spread the word about how bad the maxtreme support is...
:cry:

3danim8d
01-26-2003, 02:37 AM
David Watters,

Since we have your ear here, when can we expect some updated drivers for the GF4 ti4600 cards that will support MicroFlotsam's Direct X 9 ?


Thanks,

elvis
01-26-2003, 05:07 AM
DX9-capable detonator drivers are already out and on NVIDIA's website.

the geforce4 series of cards cannot FULLY support directx9 in hardware due to their technical limitations. this does not mean that DX9 software will not run on them, however. (DX8 software still runs on "DX7" cards like the geforce2 and gf4mx, but is not fully hardware accelerated.)

GregHess
01-26-2003, 01:35 PM
A quick note to this....

The only dx9 capable nvidia cards will be produced directly by nvidia. The secondary parts NV31 and NV34 will not support dx9. Be very careful when buying new nvidia accelerators.

http://www.xbitlabs.com/news/story.html?id=1042994509

This of course has not been confirmed.

CgFX
01-26-2003, 07:58 PM
Originally posted by GregHess
A quick note to this....

The only dx9 capable nvidia cards will be produced directly by nvidia. The secondary parts NV31 and NV34 will not support dx9. Be very careful when buying new nvidia accelerators.

http://www.xbitlabs.com/news/story.html?id=1042994509

This of course has not been confirmed.

Nowhere in the article do I see anything that supports your hard claim that they will not support dx9. There are some assumptions made in the article about what may be done to make these parts cheaper, but I still don't see how you can make a claim like the above based on that article.

I would assume that any NV3x chip would be related to NV30 (as it has always been in the past) and that many things can be done to make the chip cheaper while keeping part if not all of the DX9 compatibility requirements.

GregHess
01-26-2003, 09:33 PM
I guess time will tell which one of us is right. At least I won't feel like an ass for treating individuals with disrespect if I turn out to be wrong.



CgFx,

Thank you for editing your post toward a more... objective approach. I agree I was a bit forward in my comments, but I was also hoping to garner a response from David, who hasn't yet commented on all the Wildcat vs Quadro threads.

dmeyer
01-26-2003, 11:14 PM
I think some folks take hardware a little too seriously. :rolleyes:

CGTalk Moderation
01-14-2006, 05:00 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.