LightWave and hardware


#1

Hey everyone, newbie here, so be gentle. :wink:

I’ve been using LightWave off and on since version 4, but only recently gotten seriously back into it. I’m using 7.5c and anxiously awaiting the soon-to-be-released 8, but I digress…

My question is about video cards. I’ll soon be in the market for a new machine, and I’m more than likely going to go with a dual Xeon system after PCI-E goes mainstream. Now, I read somewhere that someone (forgive me, but I forgot who) said that LightWave doesn’t take advantage of OpenGL or DX enough to reap the benefits of a professional video card like the Quadro or FireGL cards, and that a high-end gaming card like the nVidia 5900 or ATi 9800 (or the upcoming nVidia 6800) would be fine. Comments?

I’m not a professional, so cost is a factor, but I am planning on finally getting my rear in gear, putting together a demo reel, and trying to go mainstream… along with a few thousand other people, right? :slight_smile:

Anyway, thanks in advance!

Brian


#2

I’m mainly a modeller, and I use an old GeForce 4 Ti… oh, I wanna say 4600… been a while since I’ve checked, hehe. I’m still happy with it. LW doesn’t take as much advantage of the card as it could. A faster card might make rotating the model around to look at it smoother, but your basic operations, like rotating geometry or boolean subtraction, are CPU-based at that point. I’m not sure a faster card would be much more beneficial for me.

That’s just me; I’m sure others will chime in with interesting points of view.


#3

Most of the rendering happens at the processor end. It’s just the eye candy that gets assembled at the end, which is where the video card comes in. Sooo… once again, from my understanding, the new video cards (6800) will help in some aspects, maybe a little better quality, but the processor is where most of it happens. Like FPrime: it uses one processor, not the video card, to make its image, although it does send a nice render there. :slight_smile:

Threw my two cents in; now running like hell.


#4

Funnily enough, my old GeForce 4 Ti4200 felt snappier in LightWave than my current GeForce FX 5700 Ultra…


#5

I have a GeForce 4 Ti 4400 and I have had no problems with it. But then again, I wouldn’t mind finding a 6800 on my front porch tomorrow. :slight_smile:


#6

Thank you all for your responses! This does help reaffirm my belief in investing more in the CPUs than the GPU. Now if I could just afford a dual Itanium 2… :thumbsup:

Brian


#7

I like my Dual Athlon, but don’t skimp on cooling.


#8

My old machine has a Ti 4400 and this new one a Quadro4. There is a noticeable benefit in polygon handling with the Quadro, but little else. If I hadn’t picked up the Quadro cheap off eBay, I would have been gutted to pay top dollar for it.

I say invest in FPrime or CPU power rather than a high-end card.

Cheers, Rich.


#9

I am using an XFX nVidia GeForce 4 Ti4200 128MB card (on dual 19" monitors) that I paid $99 for new, and I love it. I just upgraded my machine from a P3 800MHz w/ 512MB of RAM to a P4 3.0GHz (w/ hyperthreading) w/ 2GB of RAM, and noticed a huge increase in responsiveness from LightWave and the OpenGL redraw, and in how fast and easily I could rotate models. So I would say the video card had little to do with it, unless it was also running at, say, AGP 4x instead of 8x (the old mobo didn’t have AGP 8x)… not sure. Anyway, it seems the other hardware was holding back my video card and LightWave, not the video card itself. From personal experience I would say stick with a consumer card, as I have had no problems with them. No point spending $500 or more when a $99–150 card will do basically the same job for LightWave.


#10

Hi Shaun,

I think we agree there.

It must have been like magic to experience the new render speeds when you upgraded to the 3GHz P4.

Rich


#11

Guys, you are right up to a certain point. There is a small difference in Modeler speed when you change GFX cards (some of them work faster in wireframe mode), but AFAIK that’s because LW 7.5 supports OpenGL 1.1 and nothing above. BUT LW8 has many new OpenGL features, and they had a whole team working on the new OpenGL (I believe those guys were from Austria), so the OpenGL situation might be different in LW8. Some things are certain: LW8 now supports alpha channels and procedurals in OpenGL, which weren’t supported before. So I’d suggest seeing what LW8 offers and then choosing a GFX card.
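
For anyone who wants to check which OpenGL version their driver actually exposes, here’s a minimal sketch in C. To be clear, this is just an illustrative probe, nothing from LW itself, and it assumes you have GLUT/freeglut installed to create a context (the file name and build line are just examples):

```c
/* glprobe.c - print the OpenGL vendor, renderer, and version the driver reports.
 * Build (Linux, assuming freeglut):  gcc glprobe.c -o glprobe -lglut -lGL
 */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    /* glGetString() only works once a GL context exists,
     * so create a throwaway window first. */
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB);
    glutCreateWindow("GL probe");

    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION)); /* e.g. "1.5.0" */
    return 0;
}
```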


#12

So hopefully my investment in a GeForce FX 5700 Ultra was a good one, eh? It’s the only card, so far as I know, that currently supports OpenGL 1.5 (excluding the new GF 6800).


#13

Originally posted by Aegis Prime
So hopefully my investment in a GeForce FX 5700 Ultra was a good one, eh? It’s the only card, so far as I know, that currently supports OpenGL 1.5 (excluding the new GF 6800).

That’s a very nice card, but I wouldn’t say it’s the only one with current 1.5 support, 'cos my Gainward 5950 Ultra also supports OpenGL 1.5, and I’m pretty sure there are a few others as well ;).


#14

More good information, and again, thank you.

I’m planning on picking up LW8 before my new machine anyway, so this delay to figure out what’s new and improved with LW’s OpenGL support works out perfectly.

Just kind of scary to think that my old POS MX 440 card, with its 64MB, now only just meets the minimum memory requirement for LW8.


#15

Originally posted by Aegis Prime
So hopefully my investment in a GeForce FX 5700 Ultra was a good one, eh? It’s the only card, so far as I know, that currently supports OpenGL 1.5 (excluding the new GF 6800).

1.5.0? Bah! :wink:


#16

Only a maximum texture size of 2048 there, Para? :stuck_out_tongue:


#17

2048 should be enough for anyone :wink:


#18

And Bill Gates said that 640K of memory should be more than enough for anybody, way back when, too. :wink:


#19

2048 is plenty for what the 9600 is: a card made with only gaming in mind, and I’ll be damned if there’s any game currently on the market that uses textures that large.
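
If you’re curious what limit your own driver reports, the same kind of probe as the version check above works, querying GL_MAX_TEXTURE_SIZE (again just a sketch assuming GLUT/freeglut for the throwaway context):

```c
/* gltexmax.c - print the largest texture dimension the driver allows.
 * Build (Linux, assuming freeglut):  gcc gltexmax.c -o gltexmax -lglut -lGL
 */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    GLint maxTex = 0;

    /* A GL context must exist before any glGet* call. */
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB);
    glutCreateWindow("texture probe");

    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
    printf("GL_MAX_TEXTURE_SIZE: %d\n", (int)maxTex); /* e.g. 2048 on a 9600 */
    return 0;
}
```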


#20

2048 should be enough for anyone

LOL! Touché :applause: