Bear in mind I don’t know what I’m talking about…
Sure, a 3d texture is fine. All you need for a normal map is a way of specifying a color at a location on the surface. It’s just a map. “Given this point in space, return a value.”
Two things going on with the normal calculation at render:
1. Get the color
2. Figure out how to shade the pixel

1. Get the color
This is just your 3d texture lookup. “Oh, I’m at x,y,z on the object, therefore I’m blue.”

2. Figure out how to shade the pixel
Given that blue, how well lit is it? This involves the angle between the surface and the light.
Add a specular component. This involves the angle from the camera to the surface to the light.
And THAT angle is wiggled about depending on the normal map.
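If a sketch helps, here’s the sort of thing I mean, in rough Python (made-up names; just Lambert diffuse plus a Phong-ish specular, and the normal passed in is already the perturbed one):

    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def normalize(v):
        length = math.sqrt(dot(v, v))
        return tuple(x / length for x in v)

    def shade(albedo, normal, light_dir, view_dir, shininess=32.0):
        # All of these vectors have to live in the SAME space for the angles to mean anything.
        n = normalize(normal)      # surface normal, already wiggled by the normal map
        l = normalize(light_dir)   # surface -> light
        v = normalize(view_dir)    # surface -> camera
        ndotl = dot(n, l)
        diffuse = max(ndotl, 0.0)  # angle between the surface and the light
        r = tuple(2.0 * ndotl * nc - lc for nc, lc in zip(n, l))  # light reflected about n
        specular = max(dot(r, v), 0.0) ** shininess               # camera/surface/light angle
        return tuple(c * diffuse + specular for c in albedo)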
So, for all this I can well see why they’d all want to be in the same space. So yeah, I bet you have to do that.
The sources you are working with will also have a transform from a point on the surface into texture space (uvw) in order to get diffuse and normal (etc.) map data. Now if your textures are all calculated in object space then you’re good. IIUC some solid textures use a normalized space so you may have to do a uvw conversion too. “Oh, the top of Godzilla’s head is at 3543,12993,40932 but the noise texture that defines his scales is 0 to 1 based, so normallookup(my_noise, x/maxx, y/maxy, z/maxz).”
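Spelled out (made-up names, and assuming the object sits inside a known bounding box), that conversion is just:

    def normalized_lookup(solid_texture, pos, box_min, box_max):
        # Map an object-space point into the 0..1 uvw space the solid texture expects.
        u = (pos[0] - box_min[0]) / (box_max[0] - box_min[0])
        v = (pos[1] - box_min[1]) / (box_max[1] - box_min[1])
        w = (pos[2] - box_min[2]) / (box_max[2] - box_min[2])
        return solid_texture(u, v, w)  # whatever function defines your noise

    # e.g. normalized_lookup(my_noise, godzilla_head_point, box_min, box_max)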
Right, just in case you understand all that and I’m missing the important point… “most involve knowing the uvws of the verts”… hm…
OK, http://www.terathon.com/code/tangent.html says that the tangent at the vert is aligned to the uvw. Right, duh… a normal is a unique vector, but the tangent to an object is really a plane, not a vector.
…because the tangent space normal map distorts the normal based on r=x, g=y, b=z at this location on the object rather than in world space. So that distortion (say, 0.2 units along the x axis) varies in real-world direction based on the location on the object. If the direction the normal map thinks of as x is not in the same space as the light-angle calculations, you will shift incorrectly and your bumping will be wrong.
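Here’s roughly what I think that terathon page computes for one triangle (my own naming; a real implementation accumulates these per vertex and orthogonalizes against the vertex normal):

    def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
        # Edges of the triangle in object space and in uv space.
        e1 = tuple(b - a for a, b in zip(p0, p1))
        e2 = tuple(b - a for a, b in zip(p0, p2))
        du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
        du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
        r = 1.0 / (du1 * dv2 - du2 * dv1)  # assumes the uvs aren't degenerate
        # Tangent points along +u on the surface, bitangent along +v.
        tangent = tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))
        bitangent = tuple(r * (du1 * b - du2 * a) for a, b in zip(e1, e2))
        return tangent, bitangent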
IIUC tangent space is like a Phong-interpolated virtual smooth surface wrapping the object. It probably contacts it at the verts. The circle around an inscribed polygon (to use a 2d analogy). Now if all you had to do was Phong shading then the normals would be all you’d need to calculate the correct shading in relation to the light: two normals imply three points, implies one plane, implies one angle, determines one shade.
What you could do is say that every location on the surface has a triplet associated with it that shifts the normal by a specified amount before it is fed to the lighting calculation. Hey, it’s magenta there (255, 0, 255), so that means shift the vector positive x and positive z; now calculate the light.
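For what it’s worth, the usual convention is that each 0..255 channel maps to a -1..1 component, so mid-grey means “don’t shift at all”:

    def decode_normal_color(r, g, b):
        # 0..255 colour channels -> a -1..1 shift vector; (128, 128, 128) is ~no shift.
        return (r / 127.5 - 1.0, g / 127.5 - 1.0, b / 127.5 - 1.0)

    # decode_normal_color(255, 0, 255) -> (1.0, -1.0, 1.0): full positive x and z, full negative y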
BUT what that means is that the behaviour of a given color is not consistent across the object. On the front of a house that yellow would shift the normal east and up to catch the morning sun. On the back of a house that yellow would still shift the normal east and up to catch the morning sun. Except as you paint it (who paints normal maps?) yellow means “tilt right” or “tilt left” depending on where on the object you are, and that’s not that obvious.
Wouldn’t it be good to have a system where cyan meant “do nothing” and magenta meant “tilt towards the left no matter where this bit of texture gets applied”? So we have tangent space normal mapping. I’m making this up as I go along. Which is great because you can use the exact same pixels for a rivet anywhere on the map because the normal map is only looked at up close by an ant crawling across the object.
!!! But there’s the rub. Which way is “left”? !!!
Left is negative u. So the tangent is used to rotate the distortion vector (the scaling inherent in uvw mapping probably isn’t important (at a guess)) of the normal map into alignment with the lighting calcs so that the surface normal is perturbed in the right (left) direction.
So, in short, if you want to use a tangent space looking normal map you need SOME mechanism for determining the world (or object or light) space meaning of the normal map’s xyz perturbation. That task is usually handled by the uvw mapping.
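So the mechanism, roughly (my own names again, and assuming T, B, N are the tangent, bitangent, and normal at this point, already expressed in whatever space the lighting runs in):

    def tangent_to_lighting_space(map_normal, T, B, N):
        # map_normal is the decoded tangent-space value: x along +u, y along +v,
        # z straight out of the surface. Rotate it into the space the lighting calcs use.
        x, y, z = map_normal
        return tuple(x * t + y * b + z * n for t, b, n in zip(T, B, N))

After that the shading code never has to know a tangent-space map was involved.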
DO YOU want to use tangent space looking normal maps? I’m thinking… Bump maps are easy. Fill the map space with black and populate it with little fuzzy white spheres and you get bumps wherever a polygon slices a sphere and makes a white dot on the surface. White is always “out” and black is always “in”.
My first inclination is that your procedural texture is easier to generate if you don’t have to worry about what it looks like depending on where it falls. Just make a noise texture based on purple, pink, and cyan and it will be right so long as you know which way is right and up. Magenta is always “left” and cyan is always “right” (or whatever the colors map to).
But that assumes you can generate the map correctly. Think of that little white sphere in the black again. If it gets intersected by the front of the house you want the east side of it to be magenta. If it gets intersected by the back of the house you want the east side of it to be cyan.
!!! The COLORING of the normal map is DEPENDENT on the part of the object it corresponds to. That sounds like a bad way to define a texture.
Same problem with world space.
The meaning of a … texture voxel (volume texel?) in the 3d map is dependent on the surface that is using it. I intuit that that’s a generic “nature of normal mapping” issue and if the solution exists it is beyond the scope of this article.
If all you want is some noise stuff it may not matter (we still have yet to solve the “define x” problem). Even large scale Perlin noises and the like mayn’t be too harsh.
Someone comments: “Procedural normal mapping can be done any time you have a procedural height map - you just take partial derivatives of the height function dh/du and dh/dv, and let the normal be (1.0, -dh/du, -dh/dv). (Of course you must normalize it before using for lighting.)” Which sounds to me like bump mapping, not normal mapping. But it does jibe with my first guess at “legit solutions”.
- not really
- yes
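For what it’s worth, here’s that height-derivative recipe from the quoted comment spelled out with finite differences (I’ve put the 1 in the z slot, since z-out-of-the-surface is the tangent-space convention I’m used to):

    def normal_from_height(h, u, v, eps=1e-3):
        # h is any procedural height function h(u, v); approximate its partials.
        dh_du = (h(u + eps, v) - h(u - eps, v)) / (2.0 * eps)
        dh_dv = (h(u, v + eps) - h(u, v - eps)) / (2.0 * eps)
        n = (-dh_du, -dh_dv, 1.0)
        length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
        return tuple(c / length for c in n)  # normalize before lighting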
The real world may know better answers.