I am writing my own raytracer and have run into a problem I have not been able to solve so far.
I am trying to calculate the influence of a triangle-light (a triangle that emits light) on a point on another triangle. For this I want to calculate the surface area of the projection of that triangle-light onto the unit sphere around the point.
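If I understand the method correctly, what I need is the spherical excess of the projected triangle (Girard's theorem); in my own notation:

    E = \alpha + \beta + \gamma - \pi

where \alpha, \beta, \gamma are the corner angles of the spherical triangle and E is its area on the unit sphere. The general n-gon version, E = \sum_i \theta_i - (n - 2)\pi, is where the (3 - 2) * pi term in my code below comes from.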
I found the following explanation of how to calculate this:
http://mathforum.org/library/drmath/view/51774.html
This seems to use the same method I found elsewhere, so I assume the explanation is correct. My code, however, is wrong: the resulting value gets larger as the triangle moves farther from the unit sphere, which is of course incorrect. My code (C++):
// Unit normals of the three planes through the sphere center and each
// pair of triangle vertices; each plane contains one edge (great-circle
// arc) of the projected spherical triangle.
Point normal1 = (triangle->vertex1 - intersection) * (triangle->vertex2 - intersection);
normal1.normalize();
Point normal2 = (triangle->vertex2 - intersection) * (triangle->vertex3 - intersection);
normal2.normalize();
Point normal3 = (triangle->vertex3 - intersection) * (triangle->vertex1 - intersection);
normal3.normalize();

// Spherical excess: sum of the corner angles minus (n - 2) * pi, with n = 3.
float area = acos(normal1.dotproduct(normal2))
           + acos(normal2.dotproduct(normal3))
           + acos(normal3.dotproduct(normal1))
           - (3 - 2) * pi;
- triangle is the triangle-light;
- intersection is the center of the unit sphere;
- I implemented the operators - and * (cross product) and dotproduct (inner product) for points myself, and they seem to be correct, as I use them heavily in other parts of the raytracer without problems;
- area is the resulting value.
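In case it helps, here is a minimal standalone reproduction of the computation. The Point struct and the projectedArea/main scaffolding here are simplified stand-ins I wrote just for this snippet, not my actual classes, but they show the same behavior:

#include <cmath>
#include <cstdio>

const float pi = 3.14159265358979f;

// Simplified stand-in for my Point class: only the operations used below.
struct Point {
    float x, y, z;
    Point operator-(const Point& o) const { return Point{x - o.x, y - o.y, z - o.z}; }
    // operator* is the cross product, as in my raytracer.
    Point operator*(const Point& o) const {
        return Point{y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x};
    }
    float dotproduct(const Point& o) const { return x * o.x + y * o.y + z * o.z; }
    void normalize() {
        float len = std::sqrt(x * x + y * y + z * z);
        x /= len; y /= len; z /= len;
    }
};

// The same computation as in my raytracer: spherical excess of the triangle
// (v1, v2, v3) as seen from `center`.
float projectedArea(Point v1, Point v2, Point v3, Point center) {
    Point n1 = (v1 - center) * (v2 - center);
    n1.normalize();
    Point n2 = (v2 - center) * (v3 - center);
    n2.normalize();
    Point n3 = (v3 - center) * (v1 - center);
    n3.normalize();
    return std::acos(n1.dotproduct(n2))
         + std::acos(n2.dotproduct(n3))
         + std::acos(n3.dotproduct(n1))
         - (3 - 2) * pi;
}

int main() {
    Point center = {0.0f, 0.0f, 0.0f};
    // The same triangle moved farther and farther away: I would expect the
    // projected area to shrink toward 0, but the printed value grows instead.
    for (float d = 1.0f; d <= 8.0f; d *= 2.0f) {
        Point v1 = {-1.0f, -1.0f, d};
        Point v2 = { 1.0f, -1.0f, d};
        Point v3 = { 0.0f,  1.0f, d};
        std::printf("distance %.0f: area = %f\n", d, projectedArea(v1, v2, v3, center));
    }
    return 0;
}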
What am I doing wrong?