Ok, I thought I'd write this little thing on my findings re: nVidia drivers and XSI 2.03.
The system I've used for this is an XP1700+, 512MB PC2100 DDR and a Leadtek GeForce4 TD 4600 (MyVIVO). This is on W2K SP3, but I get similar results on XP (though this card is about 3% faster on W2K than on XP... at least for me). As far as VIA drivers go, I'm using "4in1440v(a)p3)". I'm also running dual 17" Monitors (Monitor 2 via a DVI-VGA adaptor) at 2560x960x32bit (i.e. 2x 1280x960x32bit) screen res.
Prior to this, the card was a Palit Daytona GeForce2 MX (the original MX, before the 400/200/100 split happened).
First off, the scene I used for testing was something I did a little while ago, consisting of just over 375k faces, with render hairs displayed and smoothing (Geometry Approximation) on all objects set to 2.
Now, with the GF2 MX and the full scene in a Camera viewport, I could rotate the scene at about 12fps in Textured view and 22fps in Wireframe. After upgrading to the GF4 TD4600, there was no apparent change to this! (VERY interesting)
That aside, my main interest here isn't to do with that scene at all...
Now, this GF4 TD4600 came with Detonator 29.42 (which is probably very common), but like all good boys, the first thing I did was grab Det 30.82 and Det 40.41 from nvidia.com (actually, I did that the night before the card arrived so I'd be ready).
And here's where the most interesting part comes in...
First, I install Det 40.41. Fire up XSI and all's peachy... or so I thought! I'd heard about high CPU usage when using 40.41, so I immediately fire up the Task Manager. I do a couple of things and everything seems fine... until I hit 7 (the Rendertree shortcut)!
For some odd reason, moving any node(s) around in Rendertree causes 100% CPU usage. I later discover even dragging a selection rectangle out in an EMPTY rendertree causes 100% CPU load!
I check out the FX Tree and Schematic views, and that isn't the case there. In those, I get maybe 10% usage (same with editing UVs in the Texture Editor).
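My test method was nothing fancy: just watch Task Manager while repeating one action. For anyone wanting to reproduce it a bit more precisely than eyeballing the graph, here's a minimal sketch of the same idea (Python, purely illustrative and nothing to do with XSI itself): sample wall-clock and CPU time before and after the suspect action, then work out the average CPU percentage between the two samples.

```python
import time

def cpu_percent(wall0, cpu0, wall1, cpu1):
    """Average CPU usage (%) of this process between two samples."""
    return 100.0 * (cpu1 - cpu0) / (wall1 - wall0)

# Sample, do the suspect action (here a busy loop stands in for dragging a
# selection rectangle around an empty Rendertree), then sample again.
w0, c0 = time.perf_counter(), time.process_time()
deadline = time.perf_counter() + 0.2
while time.perf_counter() < deadline:
    pass  # burn CPU for ~0.2 s
w1, c1 = time.perf_counter(), time.process_time()
print("%.0f%% CPU" % cpu_percent(w0, c0, w1, c1))
```

A busy loop like that should report close to 100%, while an idle wait (e.g. `time.sleep`) would report close to 0%, which is roughly the difference I'm seeing between driver versions.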
Very interesting I'd say! During my couple of days of testing, I did fresh installs of both W2K and XP a couple of times and the story was the same.
Well, let's drop back to Det 30.82... The story is very different here: the Rendertree uses maybe 10%. I also tested 29.42, and all's well there too.
I then install 31.00 and get mixed results.
For testing, I load my custom layout: the second Monitor contains just a viewport split in half horizontally, with the Rendertree at the top and nothing at the bottom. The VERY odd thing here is that if I also open a Rendertree on the left Monitor, that one uses low CPU, but the Rendertree on the second Monitor uses 100%... until I remove the split, at which point its CPU usage drops to match the other Rendertree. (I don't see this as a Monitor 2 problem as such. I just happened to have the layout set up that way, and I have no doubt that if I switched the layout over, I'd get 100% usage on Monitor 1 and low usage on Monitor 2.)
However! With Detonator 30.82 (and 29.42), if I switch to Dualview instead of nView for my dual-Monitor pleasure, CPU usage is back to 100% (i.e. CPU usage is low with nView and high with Dualview).
All of this isn't just limited to the Rendertree, btw. I get the same results scrubbing along the timeline in the Animation Mixer (and I have no doubt it's the case in other areas of XSI too).
As a side note, XSI is the only thing I've seen this with so far. I get 10,950 in 3DMark2001, and 132fps at 1600x1200x32 with all settings maxed running timedemos of demo001 in Quake 3 Arena (that's with Det 40.41; I get 10,250 and 130fps with Det 30.82). So as it stands at the moment, this looks more likely to be a problem on XSI's side than nVidia's.
Anyway, these are my findings so far. Thought it might be worth sharing to see if others are getting similar results.