I have a custom shader that has multiple texture channels and I am trying to get the effects of multitexturing to display properly in the Maya realtime interface.
As far as I understand it, there is no way to directly get a custom shader to display multiple UV sets at once, so I believe the "standard" solution to this problem is to use Maya's native layeredTexture node for realtime rendering purposes.
This means that the network I think I need is:
myShader.texture1 -> layeredTexture.inputs[0]
myShader.texture2 -> layeredTexture.inputs[1]
layeredTexture.outColor -> myShader.outColor

Both textures go into the layeredTexture's inputs, and layeredTexture.outColor then feeds myShader.outColor. This creates a cycle in the dependency graph, but I think that may be OK in this case.
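To make the question concrete, here is roughly how I imagine building that network through the API with MDGModifier, assuming the layeredTexture node can be created by its type name and that my guesses at the attribute layout (an `inputs` multi-compound whose first child is the color) are right:

```cpp
// Sketch only: builds the shader -> layeredTexture network with MDGModifier.
// Plug names texture1/texture2 on myShader are from my own node; the
// layeredTexture attribute names are assumptions I haven't verified.
#include <maya/MDGModifier.h>
#include <maya/MFnDependencyNode.h>
#include <maya/MObject.h>
#include <maya/MPlug.h>
#include <maya/MStatus.h>

MStatus buildLayeredNetwork(const MObject& shaderNode)
{
    MStatus status;
    MDGModifier dgMod;

    // No dedicated function set exists; create the node by its type name.
    MObject layered = dgMod.createNode("layeredTexture", &status);
    if (!status) return status;
    status = dgMod.doIt();   // node must exist before we can fetch its plugs
    if (!status) return status;

    MFnDependencyNode shaderFn(shaderNode);
    MFnDependencyNode layeredFn(layered);

    // inputs is a multi (array) compound; connect each texture to inputs[i]'s
    // color child (assuming child 0 is the color).
    MPlug inputs = layeredFn.findPlug("inputs", &status);
    for (unsigned int i = 0; i < 2; ++i)
    {
        MPlug element = inputs.elementByLogicalIndex(i, &status);
        MPlug color   = element.child(0, &status);
        MPlug texture = shaderFn.findPlug(i == 0 ? "texture1" : "texture2", &status);
        dgMod.connect(texture, color);
    }

    // layeredTexture.outColor -> myShader.outColor (the part that forms the cycle)
    dgMod.connect(layeredFn.findPlug("outColor", &status),
                  shaderFn.findPlug("outColor", &status));

    return dgMod.doIt();
}
```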
My question is how I actually implement this.
I have a C++ class that defines myShader, and it has functions that get called on various events, but no function that gets called on the creation of an instance of the class.
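(Unless MPxNode::postConstructor() counts? If I'm reading the docs right, it is called right after an instance of the node is created, so maybe the setup could be hung off that, something like:)

```cpp
// Sketch: postConstructor() runs once the node instance exists, so it may be
// a reasonable hook for creation-time setup.
#include <maya/MPxNode.h>

class MyShader : public MPxNode
{
public:
    virtual void postConstructor()
    {
        // thisMObject() is valid here, unlike in the C++ constructor, so the
        // layeredTexture network could be wired up at this point.
        // buildLayeredNetwork(thisMObject());   // hypothetical helper
    }
};
```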
Also, I'm not sure what the best way to actually create the Layered Texture node is through the API. I've been looking in the docs for an MLayeredTexture or the like, but I've had no luck.
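The fallback I'm considering is just going through MEL from C++, since the shadingNode command knows how to create it:

```cpp
// Sketch: create the layeredTexture via MEL, since there is no dedicated
// MLayeredTexture class in the API.
#include <maya/MGlobal.h>
#include <maya/MStatus.h>

MStatus createLayeredTexture()
{
    // -asTexture makes shadingNode do the usual shading-network bookkeeping.
    // shadingNode returns the new node's name; capturing it would need an
    // executeCommand overload with a result argument, omitted here.
    return MGlobal::executeCommand("shadingNode -asTexture layeredTexture");
}
```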
Does anyone have any experience or advice for creating custom shader nodes that work with multitexturing? I would be very grateful for any help people could give me.
Edit: One thing I forgot to mention is that there is additional trickiness to do with the materialInfo node for HW texturing. It's a known bug in Maya that the default materialInfo connections are wrong for custom shaders, and until you fix them the textures display all blurry. I know how to cope with that for my own custom shaders, but I'm not sure what to do when a layeredTexture node is doing the rendering.
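For reference, the fix I use on my own shaders is to rewire materialInfo's texture connection by hand, along these lines (node names are placeholders, and the attribute names are from memory, so treat them as assumptions):

```cpp
// Sketch: rewire the materialInfo node so hardware texturing picks up the
// right texture. The usual MEL form is
//   connectAttr -nextAvailable file1.message materialInfo1.texture;
// here driven from C++ via MGlobal.
#include <maya/MGlobal.h>
#include <maya/MStatus.h>
#include <maya/MString.h>

MStatus fixMaterialInfo(const MString& textureNode, const MString& materialInfoNode)
{
    MString cmd = "connectAttr -nextAvailable " + textureNode + ".message " +
                  materialInfoNode + ".texture";
    return MGlobal::executeCommand(cmd);
}
```

What I don't know is whether that same rewiring works when the texture feeding the hardware display is a layeredTexture rather than a file node.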
Also, we're using Maya 5.0 :)