
Maya API: Tangent, binormal


nimod
02-24-2005, 06:55 PM
Sorry for the repost, but I think no one in the Maya forum can help me.

Hi.
I've written a lightmapping mental ray shader for Maya. The shader works great, but I have some problems with the spaces: the calculations are done in world space, but the shader is related to normal mapping, which is why I need to do the calculations in tangent space / point space. That wouldn't be a problem, but I realized that I cannot access the tangent in mental ray. That has become a serious problem, because without it the shader won't work at all.

So my question is: is there any way to access the tangent of the point currently being shaded? Or do you see any other way? I just have a vector (x, y, z) that needs to be transformed into tangent space for every calculation.

Thanks for the help.

rakmaya
02-24-2005, 09:34 PM
I didn't quite understand what you wrote. If you email/post the shader code, we can help you better.

If you are normal mapping, you need the tangent. Not only that, you need to take the tangent, normal and binormal into the world space where the light/camera is.

To take things into tangent space you need to multiply the light vector by the tangent/binormal/normal matrix (row 1 = tangent, etc.).

Pseudocode would be like the following (this is not in any specific language, just pseudocode):

Wrld_tangent = Tangent * WORLD_MATRIX
Wrld_normal = Normal * WORLD_MATRIX
Wrld_binormal = CrossProductOf(Wrld_tangent, Wrld_normal)

TangentSpaceMatrix = CreateMatrix(Wrld_tangent, Wrld_binormal, Wrld_normal)
tangentLightDir = lightDirection * TangentSpaceMatrix

// WORLD_MATRIX is the transformation matrix of the object. From here on, just use tangentLightDir to calculate the lighting. Remember you have to convert the half-angle vector into tangent space as well (if you are doing specular lighting).

->>>> The above is the general algorithm for the tangent-space calculations (whether it is lightmapping or something else you want to do in tangent space).

Either way you need the tangent to create the tangent-space matrix. If you cannot get it, you can render it into an image and use the UV coordinates to look it up from the image file. I am talking from the HLSL/shader development point of view; I am not sure what restrictions mental ray places on this. Someone more experienced with mental ray should be able to translate all of the above into mental ray without much effort.
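
For reference, here is a minimal plain C sketch of the algorithm above. It is not tied to mental ray or any other API; Vec3 and the helper functions are made-up names, and the tangent and normal are assumed to already be transformed into world space and normalized.

typedef struct { float x, y, z; } Vec3;

/* Cross product, used to build the binormal from tangent and normal. */
static Vec3 cross(Vec3 a, Vec3 b)
{
    Vec3 r;
    r.x = a.y * b.z - a.z * b.y;
    r.y = a.z * b.x - a.x * b.z;
    r.z = a.x * b.y - a.y * b.x;
    return r;
}

static float dot(Vec3 a, Vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Multiplying by the matrix whose rows are tangent, binormal and normal
   is the same as taking one dot product per row. */
static Vec3 toTangentSpace(Vec3 v, Vec3 tangent, Vec3 binormal, Vec3 normal)
{
    Vec3 r;
    r.x = dot(v, tangent);
    r.y = dot(v, binormal);
    r.z = dot(v, normal);
    return r;
}

/* Usage, following the pseudocode:
   Vec3 wrldBinormal    = cross(wrldTangent, wrldNormal);
   Vec3 tangentLightDir = toTangentSpace(lightDirection,
                                         wrldTangent, wrldBinormal, wrldNormal);
   // convert the half-angle vector the same way if you do specular lighting
*/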

nimod
02-24-2005, 09:57 PM
Hi.
Thanks for your reply.
I thought about your solution with the external file. That was my worst-case scenario, because it would mean a lot of useless workaround... work. :) But I guess this is the easiest way to do it.
I was hoping for a solution inside mental ray. My problem was not the calculation, but simply accessing the information (the tangent) in mental ray. I mean, how does a simple tangent-space normal-mapping shader work in mental ray? (This is not what I'm doing, but it would require the same information.)

rakmaya
02-25-2005, 01:22 AM
I haven't dug into mental ray shader coding yet. No time between programming and modeling. By Sunday I will have more time and will look into it. I am gonna need it pretty soon anyway, so we'll see what I can do.

So there is no way to access the model's tangent from MR? Hm... I would guess there might be some way other than an external image. I can't really imagine using MR under such a restriction. The tangent is a very important property used in many effects. Generating a tangent file for one or a few objects is OK, but a large dynamic environment such as water (any fluid feature of Maya) is a pain in the b*** There must be some way!!!

nimod
02-25-2005, 10:51 AM
That's exactly my opinion! :) Thanks for your help. I'll be using the weekend, too, to find something.

playmesumch00ns
02-25-2005, 01:02 PM
You can get to it. I forget how exactly, but I think it involves writing a displacement shader or something that gets run first, then attaching the tangent information per vertex so that you can access it in your later shaders. Something like that.

nimod
02-25-2005, 04:18 PM
That gives me new hope. :)
Do you mean a Maya displacement shader?
I've read that the API in the new Maya 6.5 update contains the ability to read the tangent. Maybe we can write a simple node that exposes that information.

gga
02-26-2005, 02:43 AM
For Maya, state->bump_x_list and state->bump_y_list will give you smooth tangent information, as long as you are mapping along your UV coordinates (i.e. no projections). That works for both meshes and NURBS surfaces. In the case of projections, Maya allows baking projections into UV coordinates, but some precision is usually lost, as the info is baked onto the vertices.

For XSI, I believe the XSI scene converter will not spit out tangent information into bump_x_list by default, but it will give you tangents for surfaces in state->derivs[0-1]. I think XSI offers similar features to Maya overall, but you may need to flip a switch on the object, though.

For 3dmax, I am not sure, as I've never used it.

Besides that, for projections you can, with some math work, calculate the tangents analytically, something I know XSI's shaders come with, unlike Maya's and 3dmax's shader sets.

Finally, for any package and mapping method, you can always approximate the tangent information the way it is usually done in games. If you look at the 3 texture coordinates of the triangle vs. the 3 positions of the 3 vertices of the triangle being shaded (the stuff you can get from mi_tri_vectors()), you can see the problem as a linear system with 3 unknowns, which can then be solved by any standard matrix-solving method. The precision and accuracy of those tangents will largely depend on the size of the triangle, and they will usually not be continuous along the surface, but for a lot of uses that works just fine.
Probably one of the most straightforward, no-nonsense explanations of this is Matt Pharr's cgrr post from some time ago:
http://groups-beta.google.com/group/comp.graphics.rendering.raytracing/browse_frm/thread/1e9bc073003f3b1b/81369c026e9b0e9a?q=Matt+Pharr+dPdu&_done=%2Fgroups%3Fq%3DMatt+Pharr+dPdu%26qt_s%3DSearch+Groups%26&_doneTitle=Back+to+Search&&d#81369c026e9b0e9a
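
A hedged plain C sketch of that per-triangle solve follows (the struct and function names are made up for illustration; in a real shader the three positions and UVs would come from the renderer, e.g. via mi_tri_vectors() as mentioned above):

#include <math.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float u, v; } UV;

/* Per-triangle tangent (dPdu) from 3 positions and 3 UVs: solve the 2x2
   system  E1 = du1*T + dv1*B,  E2 = du2*T + dv2*B  for T. */
static Vec3 triangleTangent(Vec3 p0, Vec3 p1, Vec3 p2, UV t0, UV t1, UV t2)
{
    Vec3 e1 = { p1.x - p0.x, p1.y - p0.y, p1.z - p0.z };
    Vec3 e2 = { p2.x - p0.x, p2.y - p0.y, p2.z - p0.z };
    float du1 = t1.u - t0.u, dv1 = t1.v - t0.v;
    float du2 = t2.u - t0.u, dv2 = t2.v - t0.v;

    float det = du1 * dv2 - du2 * dv1;
    Vec3 tangent = { 0.0f, 0.0f, 0.0f };   /* degenerate UVs -> zero vector */
    if (fabsf(det) > 1e-8f) {
        float r = 1.0f / det;
        tangent.x = (e1.x * dv2 - e2.x * dv1) * r;
        tangent.y = (e1.y * dv2 - e2.y * dv1) * r;
        tangent.z = (e1.z * dv2 - e2.z * dv1) * r;
    }
    return tangent;   /* constant per triangle; normalize before use */
}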

If you are looking for a mental ray shader that will give you that and that you can plug into a shading network, you can download and use gg_showinfo in mrClasses.

nimod
02-26-2005, 03:11 PM
Thanks for the detailed information. I'm using Maya, so I guess I'll use the state->bump_x_list and state->bump_y_list method. But why is there only x and y? Is z always 1?

gga
02-26-2005, 11:18 PM

You just don't have a Z, as the vector pointing in that direction is (approximately) the actual shading normal of the surface.
It's just that the naming mental images chose for those state variables is really poor, as x and y are traditionally used to identify vertex coordinates.
These derivatives are usually referred to as dPdu and dPdv (or dPds and dPdt, depending on whether your software calls the mapping coordinates UVs or STs), or tangentU and tangentV, as Maya's Hypergraph calls them.
Understand that state->bump_x/y_list[] are not single floats but actual vectors (in fact, they are arrays of vectors, where state->bump_x_list[0] corresponds to the dPds of your first uvmap, state->bump_x_list[1] corresponds to the dPds of your 2nd uvmap if present, etc.).
Also, note that state->bump_x/y_list[] are not normalized to unit length, so you usually want to do something along the lines of:
miVector dPds = state->bump_x_list[0];
miVector dPdt = state->bump_y_list[0];
mi_vector_normalize(&dPds);
mi_vector_normalize(&dPdt);

Also, note that all state variables are defined in the "internal" space of the renderer (this means you are basically not supposed to care about what a unit in those vectors means, as the vendor may change how the renderer works internally, and this has already happened before: in mray3, "internal" space means world-space units, but in mray2 and earlier it meant camera space).
To make sure your vectors are defined in object space, you should use the mi_vector_to_object() function to transform a vector from "internal" coordinates to object coordinates. Thus, you'd have....

miVector N;
miVector dPds = state->bump_x_list[0];
miVector dPdt = state->bump_y_list[0];

// Transform from the renderer's "internal" space to object space.
// Note: mi_vector_to_object() takes the state as its first argument;
// for the normal, mi_normal_to_object() is the more correct call if the
// object has non-uniform scaling.
mi_vector_to_object(state, &dPds, &dPds);
mi_vector_to_object(state, &dPdt, &dPdt);
mi_vector_to_object(state, &N, &state->normal);

// Normalize to unit length.
mi_vector_normalize(&dPds);
mi_vector_normalize(&dPdt);
mi_vector_normalize(&N);

// Use N, dPds, dPdt to do whatever you want here....
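
As a hedged follow-up (not part of the post above): for the original question of taking an arbitrary vector into tangent space, once N, dPds and dPdt are in the same space as that vector, the conversion is just three dot products against the basis. This treats the basis as roughly orthonormal; if your UVs are sheared you may want to orthogonalize dPdt against dPds first.

// v is assumed to be the vector you want in tangent space, already
// transformed into the same space as N, dPds and dPdt.
miVector v_ts;
v_ts.x = mi_vector_dot(&v, &dPds);   // component along tangent U
v_ts.y = mi_vector_dot(&v, &dPdt);   // component along tangent V
v_ts.z = mi_vector_dot(&v, &N);      // component along the normal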

nimod
03-02-2005, 09:34 AM
Hi gga.
Thank you so much. After a few problems it finally works. And it works great. :)
Thank you for your help.
Greets,
nimod

CGTalk Moderation
03-02-2006, 10:00 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.