
View Full Version : Cg realtime shaders


Cman
11-26-2003, 10:31 PM
link (http://www.rendernode.com/articles.php?articleId=104&page=2)
3rd-Generation Vertex and Pixel Programmability

Enables real-time shaders to simulate a wide range of physical effects (such as Fresnel effects, chromatic dispersion, reflection, refraction, etc.) and surface properties (such as casting effects, molded surfaces, etc.). This feature is very important, for example, in game development: if you want to preview certain effects in real time, vertex and pixel shaders give you the ability to watch real-time rendered scenes in your viewport windows. 3ds max and Maya users have to install a Cg plug-in which enables this feature. Cg is native in SOFTIMAGE|XSI.

Does anyone know if LW will be getting a free plugin for this? Or will it be native?
Thanks.

telamon
11-26-2003, 10:35 PM
You need more than a plugin to get the benefit of the video cards' programmability, I think.

Para
11-27-2003, 06:07 AM
This may sound a bit harsh, but I personally wouldn't want Cg support in LW. GLSlang would be better if LW stays OGL, otherwise pure HLSL.

"Why?" you ask. Well, here's the main reason: Tomb Raider: AOD with Cg (http://www.beyond3d.com/forum/viewtopic.php?t=7422)

As you can see, Cg is a bit slower than HLSL on the GFFX and a lot slower on a Radeon. I'd hate to be tied to buying a certain 3D card just to run LW.

That was my .02 euro cents.

telamon
11-27-2003, 10:56 AM
It seems that the most recent releases of Maya and Max use the Cg standard.

OpenGL also started out as something proposed by a single video card manufacturer, as far as I remember. Fortunately it became a standard.

If something new improves workflow and work comfort, I see no problem with implementing it in my favorite software ;)

Para
11-27-2003, 11:57 AM
I can't see how implementing Cg would be better than GLSlang or HLSL... :hmm:

telamon
11-27-2003, 12:18 PM
Neither can I :beer:

noclar7
11-27-2003, 08:32 PM
It's kind of, oh I don't know, weird though: if 3 out of the 4 top 3D vendors have implemented this solution, why wouldn't LightWave, via a plugin, as well? If it's a plugin, I can't see any harm in it. LightWave has been trying to break into the gaming market since 6.0, why stop now? I'm a Quadro FX 1000 owner and I would love to see this implemented.

Cman
11-28-2003, 01:19 AM
Not really trying to compare apps. The quote was to show there are plugins, more than anything.
I guess I should have asked "when" LW will be getting a plugin.
So...
Does anyone know who is working on one or will it be native?
Sorry if I ruffled any feathers. :thumbsup:

Brett H.
11-28-2003, 01:58 AM
I have to think (in my limited but humble opinion) that if LW doesn't implement some form of support for vertex/pixel shading, it will be burying itself in the gaming market. One cannot hope to compete in the game market without competing with the newer DirectX 9 pixel/vertex shader technology currently at hand.

Let's hope LW is attempting (at least) to implement some of this technology.

Brett

Para
11-28-2003, 07:22 AM
Originally posted by noclar7
It's kind of, oh I don't know, weird though: if 3 out of the 4 top 3D vendors have implemented this solution, why wouldn't LightWave, via a plugin, as well? If it's a plugin, I can't see any harm in it. LightWave has been trying to break into the gaming market since 6.0, why stop now? I'm a Quadro FX 1000 owner and I would love to see this implemented.

Of course it's understandable that NewTek might implement Cg since others have done so, but it'd have a few major issues:

Cg is equivalent to HLSL. Actually, they are almost the same, and shader code written in either HLSL or Cg will compile with both languages' compilers. So there would be no point in using Cg when you get "more bang for the buck" from HLSL, since Cg is slooooow :p

The bigger problem for LW & Cg is that Cg is a Direct3D-based shading language, and so is HLSL. LW is 100% OpenGL, so if NewTek wanted to implement Cg, they would have to rewrite every OGL part and replace it with D3D. That's why I think GLSlang would be a better solution for LW, since GLSlang is about the same as an HLSL pixel shader and it's part of OpenGL 2.0. As a note, OGL 2.0 support in drivers (both ATI and nVidia) is coming soon; in fact, ATI already has a set of drivers that support GLSlang... not officially though. nVidia has promised OGL 2.0 support for Christmas.
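
To make the "almost the same" point concrete, here is a minimal sketch of a fragment shader written in the syntax the two languages share; the struct and sampler names are just illustrative, but source along these lines should compile with both the Cg and HLSL compilers:

// A minimal fragment (pixel) shader -- the same source works as Cg or HLSL.
sampler2D baseMap;             // texture bound by the application

struct v2f
{
    float4 color : COLOR0;     // interpolated vertex color
    float2 uv    : TEXCOORD0;  // texture coordinates
};

float4 main(v2f IN) : COLOR
{
    // Modulate the texture by the vertex color.
    return tex2D(baseMap, IN.uv) * IN.color;
}

Roughly speaking, at this level the differences show up in the toolkits and runtimes around the languages rather than in the shader source itself.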

E_Moelzer
11-28-2003, 07:48 AM
Never forget the Mac LW userbase!
This is why I have to agree that GLSlang would be the better approach.
The driver support is still not that great, though.
I think it will take some time until good drivers become available.
CU
Elmar

telamon
11-28-2003, 10:27 AM
Maya also has a Mac version, and Alias has implemented Cg in its software.

Who is making the mistake, NewTek or Alias? I don't know personally. We'll find out in the future.

Para
11-28-2003, 11:09 AM
Originally posted by telamon
Maya also has a Mac version, and Alias has implemented Cg in its software.

I had to check; the Cg plug-in is only available for Windows Maya 4.5 and 5.

telamon
11-28-2003, 12:06 PM
No need to check, my statement was incomplete. I meant Cg shaders are implemented in the Windows version ;)

noclar7
11-28-2003, 01:31 PM
Thanks for clearing up the technicalities, Para, and don't worry Cman, no feathers ruffled here... lol. I'm actually finding a bit more peace of mind over here compared to the NewTek forums, where everyone is going crazy waiting for 8.
On that note, I understand that LW 8 is supposed to have some OpenGL enhancements. I remember them saying something like "now all those fancy card features can be used" <- that's not quoting anyone exactly, just the gist of what I got out of it.

Brett H.
11-28-2003, 10:52 PM
The bigger problem for LW & Cg is that Cg is a Direct3D-based shading language, and so is HLSL. LW is 100% OpenGL, so if NewTek wanted to implement Cg, they would have to rewrite every OGL part and replace it with D3D.
This is a good point, I hadn't really thought about it. LW will not be able to display D3D effects, since it is entirely OGL. But if you've seen what these effects can do in real time, you've seen what I think is the future of visual effects in gaming, as well as arch/vis walkthroughs, etc.

Brett

Para
11-29-2003, 07:17 AM
Originally posted by Brett H.
This is a good point, I hadn't really thought about it. LW will not be able to display D3D effects, since it is entirely OGL. But if you've seen what these effects can do in real time, you've seen what I think is the future of visual effects in gaming, as well as arch/vis walkthroughs, etc.

Brett

Can't deny that :)

Actually, if you are interested in shader effects in general (just watching them work, or coding), I suggest you go to Humus' site (http://esprit.campus.luth.se/~humus/) and under 3D just download everything and see if it works :)

Most of the tech demos are DX9 though.

samartin
11-29-2003, 08:55 AM
I'm pretty sure that LW 5.6 used to have an option for DirectX viewports. NT seems to have dropped it with the release of 6.0+ :shrug:

Brett H.
11-29-2003, 08:08 PM
On the old computer I run at home, none of the effects would be visible anyway...and at work I really don't have time (or permission) to play around...:)

I wasn't aware that LW ever did DX. I do know that the entire interface is currently OGL.

Seriously, I think NewTek has to move toward some implementation of these real-time effects. There's a reason why ATI and nVidia have spent all these research dollars on vertex/pixel shading technology for DX9. The demand is there for gaming, and arch/vis definitely benefits as well.

If it takes running LW in DirectX, I'd be willing to do that (Max gives you the option of OGL or DirectX display). I just wonder if it'd take a complete rewrite to allow DX display?

If you haven't seen it, you have to check out Real-Time Rendering with Natural Light (http://www.debevec.org/RNL/) from Paul Debevec. Simply amazing lighting effects in real-time. Imagine the implications for architectural walkthroughs, as well as gaming.

Brett

DaveW
11-30-2003, 06:48 AM
Originally posted by Para

The bigger problem for LW & Cg is that Cg is a Direct3D-based shading language, and so is HLSL. LW is 100% OpenGL, so if NewTek wanted to implement Cg, they would have to rewrite every OGL part and replace it with D3D.

Cg uses OpenGL, and it works on OS X and Linux. HLSL was created by Microsoft, so I assume it's DirectX only.

Para
11-30-2003, 07:41 AM
Originally posted by DaveW
Cg uses OpenGL, and it works on OS X and Linux. HLSL was created by Microsoft, so I assume it's DirectX only.

I should've checked that too. It seems it works through ARB_fragment_program in OpenGL. I think I'll have to go ask the big guys, who know this stuff, about Cg's speed in OGL...
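
For anyone curious what that looks like from the application side, here's a rough, untested sketch of the Cg OpenGL runtime picking an ARB profile and loading a shader -- the calls are the ones I remember from the Cg toolkit headers, and the file name and entry point are just placeholders:

#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Sketch only: compile and load a Cg fragment shader through the OpenGL
   path. Assumes a GL context is already current; error checking omitted. */
static CGprogram load_fragment_shader(CGprofile *out_profile)
{
    CGcontext ctx = cgCreateContext();

    /* Ask the runtime for the best fragment profile the driver exposes
       under OpenGL -- on DX9-class cards typically arbfp1, i.e.
       ARB_fragment_program. */
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);

    /* "surface.cg" and "main" are placeholder file/entry names. */
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "surface.cg",
                                             profile, "main", NULL);
    cgGLLoadProgram(prog);

    *out_profile = profile;
    return prog;
}

/* Per frame the app would then do roughly:
       cgGLEnableProfile(profile);
       cgGLBindProgram(prog);
       ...draw the geometry...
       cgGLDisableProfile(profile);                                      */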

Anyway, I still think that GLSLang would be better :)

paul k.
11-30-2003, 04:53 PM
How difficult is it to learn to write custom shaders in Cg? Or is it strictly built within the 3D software? I really got excited about this when I watched some of the real-time XSI shader demos, and I'm definitely interested in doing stuff like this, but I have never written anything but expressions. What is the curve like for the common artist?

noclar7
11-30-2003, 10:18 PM
This might be of interest to some:

http://www.cgshaders.org/

and

http://developer.nvidia.com/page/home

tburbage3
12-01-2003, 07:55 PM
I think there is some confusion here regarding Cg, HLSL, etc.

Cg is from nVidia and is both a shading language specification and a toolset. The language in its first incarnation is almost an exact duplicate of HLSL, as they were developed collaboratively.

When you compile a vertex or pixel shader, you specify which rendering environment you want to target -- DirectX or OGL -- with various levels of support specifiable (i.e. you can go for greater compatibility at the expense of some bleeding-edge DX/OGL features if you choose). The idea is that you can write shader descriptions in a common language that separates the high-level shader specification from the low-level implementation provided by the drivers and graphics hardware. Cg does not force you to go one way or the other, and it does not force an nVidia-specific agenda on the end users of the shaders.
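
As a concrete (if hedged) illustration: with the offline cgc compiler you point the same source at whichever profile you want. The flags below are as I remember them from the Cg toolkit, and "surface.cg" is just a placeholder, so check cgc -help before relying on them:

cgc -profile arbfp1 -entry main surface.cg    (emits OpenGL ARB_fragment_program code)
cgc -profile ps_2_0 -entry main surface.cg    (emits Direct3D pixel shader 2.0 code)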

Vertex shaders basically allow calculations of vertex transformations (motion, deformation) to be offloaded from the CPU to the GPU. However, the topic of this thread seems to be more related to pixel shaders, so...

Usage of pixel shaders in an app like LightWave might be divided into (at least) several categories:

1) Ability to assign shader scripts in appropriate locations in the UI and rendering/animation pipeline, such as the Surface Editor (pixel shaders) for surface or volumetric effects;
2) Ability to preview pixel shader effects in OGL, in VIPER, or both. Either VIPER or a standalone render window could be enabled to display a DirectX preview without affecting the main OGL design-time display system;
3) Ability of the renderer to generate final imagery taking the pixel shaders into account;
4) Ability to directly create and modify pixel shaders within the UI (which could be implemented as a plugin, as I assume it is in Maya and XSI -- both also OGL-dominant in terms of design-time display).

Initially, LW might be a simple consumer of shaders, without necessarily providing a UI for creating/editing them directly. Both Cg and ATI's RenderMonkey provide help on the design-time front.

I think starting to phase in support for hardware-based rendering, for both design-time preview and final imagery, is a must for LW -- though I will be surprised to see it in v8. The potential for 3D apps to bypass the CPU and take advantage of the graphics hardware is huge and is going to really take off in the next couple of years. I'd hate to see LW get left behind.

noclar7
12-02-2003, 03:04 PM
Here is some interesting news about Cg being integrated into an app capable of real-time hardware/software rendering:

http://www.lightworkdesign.com/full_pr.php?prid=78

Cman
12-02-2003, 10:24 PM
I've been looking forward to something like Lightworks in LW for about a year now.
I guess, as my original query goes unanswered, I'll have to hope someone takes it up as a challenge or something. :shrug:

CGTalk Moderation
01-16-2006, 06:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.