
Lightwave and hardware


PTG
04-15-2004, 11:57 PM
Hey everyone, newbie here, so be gentle. ;)

I've been using Lightwave off and on since version 4, but only recently have gotten seriously back into it. I'm using 7.5c and anxiously awaiting the soon to be released 8, but I digress...

The question I have is about video cards. Soon I'll be in the market for a new machine, and I'm more than likely going to go with a dual Xeon system after PCI-E goes mainstream. Now, I read somewhere else that someone (forgive me, but I forgot who) said that Lightwave doesn't take advantage of OpenGL or DX enough to reap the benefits of a professional video card like the Quadro or Fire cards, and that a high-end gaming card like the nVidia 5900 or ATi 9800 (or the upcoming nVidia 6800) would be fine. Comments?

I'm not a professional, so cost is a factor, but I am planning on finally getting my rear in gear, putting together a demo reel and trying to go mainstream... along with a few thousand other people, right? :)


Anyway, thanks in advance!

Brian

NanoGator
04-16-2004, 01:49 AM
I'm a modeller mainly, and I use an old GeForce 4 Ti... oh, I wanna say 4600... been a while since I've checked, hehe. I'm still happy with it. LW doesn't take as much advantage of the card as it could. A faster card might make rotating the model around to look at it smoother, but when you do your basic stuff like rotating or subtracting out geometry, it's CPU-based at that point. I'm not sure a faster card would be much more beneficial for me.

That's just me; I'm sure others will chime in with interesting points of view.

Spacemanbob
04-16-2004, 03:19 AM
Most of the rendering happens at the processor end. It's just the eye candy that gets assembled at the end, which is where the video card comes in. Sooo... once again, from my understanding, the new video cards (6800) will help in some aspects, maybe a little better quality, but once again the processor is where most of it happens. Like FPrime: it uses one processor, not the video card, to make its image, although it does send a nice render there. :)

*threw my two cents in now running like hell*

Aegis Prime
04-16-2004, 05:33 AM
Funnily enough, my old GeForce 4 Ti4200 felt snappier in LightWave than my current GeForce FX 5700 Ultra...

Spacemanbob
04-16-2004, 12:00 PM
I have a GeForce 4 Ti 4400 and I've had no problems with it. But then again, I wouldn't mind finding a 6800 on my front porch tomorrow. :)

PTG
04-16-2004, 03:05 PM
Thank you all for your responses! This helps me reaffirm my belief in investing more in the CPUs than the GPU. Now if I could just afford a dual Itanium 2... :thumbsup:

Brian

NanoGator
04-17-2004, 02:44 AM
I like my Dual Athlon, but don't skimp on cooling.

Dickigeeza
04-17-2004, 06:27 PM
My old machine has a Ti 4400 and this new one a Quadro4. There is a noticeable benefit in moving polygons around with the Quadro, but little else. If I hadn't picked up the Quadro cheap off eBay, I would have been gutted to pay top dollar for it.

I say invest in FPrime or CPU power rather than a high-end card.

Cheers Rich.

LittleFenris
04-18-2004, 05:18 AM
I am using an XFX nVidia GeForce 4 Ti4200 128MB card (on dual 19" monitors) that I paid $99 for new, and I love it. I just upgraded my machine from a P3 800MHz w/ 512MB of RAM to a P4 3.0GHz (w/ hyperthreading) w/ 2GB of RAM and noticed a huge increase in responsiveness from Lightwave and the OpenGL redraw, and in how fast and easily I could rotate models. So I would say the video card had little to do with it, unless it was also running at AGP 4x instead of 8x (the old mobo didn't have AGP 8x)... not sure. Anyway, it seems the other hardware was holding back my video card and Lightwave, not the video card itself. From personal experience I would say stick with a consumer card, as I have had no problems with them. No point spending $500 or more when a $99-150 card will do basically the same job for Lightwave.

Dickigeeza
04-18-2004, 08:22 AM
Hi Shaun,

I think we agree there.

It must have been like magic to experience the new render speeds when you upgraded to the 3GHz P4.

Rich

Lewis3D
04-18-2004, 12:07 PM
Guys, you are right up to a certain point. There is a small difference in Modeler speed when you change GFX cards (some of them work faster in wireframe mode), but AFAIK that's because LW 7.5 supports OpenGL 1.1 and nothing above. BUT LW8 has many new OpenGL features, and they had a whole team working on the new OpenGL (I believe those guys were from Austria), so the OpenGL situation might be different in LW8. Some things are certain: LW8 now supports alpha channels and procedurals in OpenGL, which weren't supported before. So I'd suggest seeing what LW8 offers and then choosing a GFX card.
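
For anyone who wants to see for themselves what their card and driver actually expose, here's a minimal C sketch (an illustration only, assuming a GLUT development setup such as freeglut is installed; the extension checked at the end is just an example, not something Newtek has named) that prints the reported renderer and OpenGL version:

/* Minimal sketch: print what the OpenGL driver reports.
   Assumes a GLUT development setup (e.g. freeglut) is installed. */
#include <GL/glut.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("glinfo");   /* a GL context must exist before glGetString() is valid */

    printf("Renderer:   %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL version: %s\n", (const char *)glGetString(GL_VERSION));

    /* Example extension check; which extensions LW8 actually uses is Newtek's call. */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    printf("ARB multitexture: %s\n",
           strstr(ext, "GL_ARB_multitexture") ? "yes" : "no");
    return 0;
}

Build it against the GLUT and OpenGL libraries (on Linux, something like: gcc glinfo.c -o glinfo -lglut -lGL) and compare the version string with whatever LW8 turns out to require.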

Aegis Prime
04-18-2004, 12:17 PM
So hopefully my investment in a GeForce FX 5700 Ultra was a good one eh? It's the only card so far as I know that currently supports OpenGL 1.5 (excluding the new GF 6800).

Lewis3D
04-18-2004, 12:35 PM
Originally posted by Aegis Prime
So hopefully my investment in a GeForce FX 5700 Ultra was a good one eh? It's the only card so far as I know that currently supports OpenGL 1.5 (excluding the new GF 6800).

That's a very nice card, but I wouldn't say it's the only one with current 1.5 support, 'coz my Gainward 5950 Ultra also supports OpenGL 1.5, and I'm pretty sure there are a few others as well ;).

PTG
04-18-2004, 02:10 PM
More good information, and again, thank you.

I'm planning on picking up LW8 before my new machine anyway, so this delay to figure out what's new and improved with LW's OpenGL support works out perfectly.

Just kind of scary to think that my old POS MX 440 card, at 64MB, only just meets the minimum memory requirement for LW8.

Para
04-18-2004, 02:58 PM
Originally posted by Aegis Prime
So hopefully my investment in a GeForce FX 5700 Ultra was a good one eh? It's the only card so far as I know that currently supports OpenGL 1.5 (excluding the new GF 6800).

1.5.0? Bah! ;)

Aegis Prime
04-18-2004, 04:05 PM
Only a maximum texture size of 2048 there Para? :p

Para
04-18-2004, 04:08 PM
2048 should be enough for anyone ;)

PTG
04-18-2004, 07:22 PM
And Bill Gates said that 640K of memory should be more than enough for anybody way back when, too. ;)

Kwago
04-18-2004, 08:12 PM
2048 is plenty for what the 9600 is - a card made with only gaming in mind - and I'll be damned if there's any game currently on the market that uses textures that large.

Aegis Prime
04-18-2004, 08:39 PM
2048 should be enough for anyone

LOL! Touché :applause:
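
For the curious, the texture limit being joked about here is something the driver will tell you directly. A minimal sketch (again assuming a GLUT setup is available):

/* Minimal sketch: ask the driver for its maximum 2D texture dimension. */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint maxTex = 0;

    glutInit(&argc, argv);
    glutCreateWindow("texlimit");              /* glGet* needs a live context */
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
    printf("GL_MAX_TEXTURE_SIZE: %d\n", maxTex);  /* e.g. 2048 on a Radeon 9600-class card */
    return 0;
}

Note that this limit only affects what the OpenGL viewport can display at full resolution; the final render is CPU-based, as mentioned above, so larger image maps still render fine.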

Beamtracer
04-18-2004, 10:08 PM
Originally posted by NanoGator
I like my Dual Athlon, but don't skimp on cooling.

64-bit hardware sounds like the way to go these days. It won't become obsolete as fast.

alephnull
04-20-2004, 09:18 AM
let me back myself up here. i do some work for a company that sells a lot of pro and consumer graphics gear and get to play with the really expensive stuff (infinite reality) and pit it against the run-of-the-mill (geforce, nvidia, wildcat and so on) using apps like maya, max, and LW and all that [whatever program runs on whatever system all get run through the wringer]. depending on what sort of work you want to do and how much money you have on hand, being able to preview things like particle effects can be greatly enhanced by a good pro gpu. on the other hand, test renders on a fast cpu can show results in the finished look, but being able to get a sense of timing before you commit to a rendered sequence may be important to weigh up. ik and all of that works faster in openGL with a good card, but what's 'good'? in my experience lately the ati 9800/128-bit boards at the consumer end have a lot of bang for the buck, but you get a little special sauce (code) with the pro version, with settings for LW that are supposed to help you get rid of weird artifact problems while viewing in 'sketch' mode and so forth. on a bit of a side note, whatever you wind up purchasing, make sure that you get a really solid and robust power supply, because those things eat a lot of juice and can crash your machine if there's not enough to go around for the rest of the system. even though most of my customers are in the defense industry i see a lot of talent along the way and ultimately, the bottom line is in working with what you've got and just creating (in their case, destroying..but that's beside the point). breaking into mainstream entertainment, your demo reel shouldn't rely on technology. it's all you. prospective employers will be more impressed with how YOU solve problems than with your gpu's floating point benchmark. but if you need a push to crank it out, no harm in that either..spend as little as possible and then, when the time comes, let them set you up with a dream machine by showing off how clever you are...not how hungry you are (because you blew every last cent on a card that will be outdated before the sun goes down)...unless you're going freelance. that's another story and you should get professional help (before you buy big and go crazy).

Stone
04-20-2004, 11:00 AM
Originally posted by alephnull
.. get to play with the really expensive stuff ..

and still they can't afford to give you a keyboard with a working enter key? ;)

/stone

alephnull
04-20-2004, 01:07 PM
point taken. always hard pressed to throw that switch ever since i read john varley's novella 'press enter[]'.
made me paranoid i guess. sorry for my antisocial syntax.

now that i think of it, F9 seems to be a little sticky..mm

PTG
04-20-2004, 02:00 PM
Thanks for the input, alephnull. I guess if the 9800 is the "top of the low end", then waiting for the 6800 from nVidia for the new machine I'll get next year should put me in some good company.

Who knows, maybe the FX 4000 card will drop to $200 by Q1 2005. :thumbsup:
