Hey guys,
Thanks for your effort!
Had a lot of office communication to do today, but now I'm focused on the problem again. Please see the attached screenshot - I assume everything is set up properly for baking. And I do get proper normal results through Vray, but still in bad quality, which I seriously don't understand… In my eyes a normal map should be a data-only texture; there should be no raytracing involved. Which makes total sense, as GI is turned off globally and it still renders the (fuzzy) output.
The screenshot is set up for Vray this time; if I switch back to Scanline, all object settings should remain the same for sure, so there should be nothing wrong with those. There are still leftover Vray elements I could choose from in the render elements dropdown, I'm just ignoring those. If I choose the standard normal map element type, it still renders this plain color. Which is even stranger, as the same result comes from Xnormal.
I did some baking tests a few weeks ago and Xnormal did deliver 100% proper results. Still, it doesn't really fit into the actual pipeline, as I need to bake a load of elements, which ideally would be supported by a custom script inside 3DSMax. There is a discontinued script for batch baking objects with 3DS and Xnormal, but it isn't working properly and isn't the preferred way to go.
Ideally I'd stay within 3DS only to fulfil that task…
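Just to show what I'm aiming for, here's a minimal MAXScript sketch of the kind of batch bake loop I have in mind (roughly along the lines of the batch-bake examples in the MAXScript help; the output path, map size and padding value are just placeholders, and it assumes the projection / cage are already set up per object through Render To Texture):

for obj in selection do
(
	-- clear any old bake elements on the object
	obj.INodeBakeProperties.removeAllBakeElements()

	-- add a Normals Map bake element (placeholder size and path)
	nm = NormalsMap()
	nm.outputSzX = nm.outputSzY = 1024
	nm.fileType = (getDir #image) + "\\" + obj.name + "_normals.tga"
	nm.fileName = filenameFromPath nm.fileType
	nm.enabled = true
	obj.INodeBakeProperties.addBakeElement nm

	-- bake from UV channel 1 with a bit of edge padding
	obj.INodeBakeProperties.bakeEnabled = true
	obj.INodeBakeProperties.bakeChannel = 1
	obj.INodeBakeProperties.nDilations = 2

	-- render just this object in bake mode
	select obj
	render rendertype:#bakeSelected vfb:off progressBar:true outputSize:[1024,1024]
)

Nothing fancy, just one loop per object, so it could later run off the presets without manual setup.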
The strange thing about Xnormal and Scanline delivering the same wrong (plain color only) results is that for Xnormal the objects' setup would already have to be wrong somehow.
But to be honest, there IS not that much that CAN be done wrong, right?? :)
I'm already guessing that there is an evil bug somewhere, something I just CAN'T find… ^^
To address your answers directly:
@darthviper107: HP and LP objects share the same space, the LP object being a heavily reduced version of the HP object, all properly quad-face modeled. Projection and cage are set up by the Render To Texture tool; I tried changing the cage size, which didn't help. It should work that way anyway, as I'm building a pipeline that should run semi-automatically with presets. Still, that's the right way to hunt for the errors, but the cage doesn't seem to be an issue. I'm using the automatic unwrap / flatten option as well; I had some good test results with it before. (Actually the objects will be pretty simple, so there's no need for perfect unwrapping or ideal UV placement; it does the job.) And sure, when I switched back and forth between Scanline / Vray I did set up the correct elements for each type…
@Noren: Actually yes, at least in this case, but in many future cases too. It's going to be a higher-detailed version of specific furniture elements being projected onto simple boxes of the same size. We need to be sure that both versions are exactly the same size for later use. Still interesting to know. I DID have some artifact problems but kind of managed to sort them out. GI wasn't deactivated this time, but I had it deactivated before while trying to solve things, and it didn't make any difference either. Side question: why should it make a difference, apart from speed improvements? It might happen that you want to bake several element types at once, some needing raytracing, some being data textures that don't. They should still work together, right?
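For example (again just a sketch, same disclaimer as above: sizes are placeholders, and the file paths and the actual bake render call are left out), mixing a lit element with a data-only one on the same object would simply mean adding both bake elements before rendering:

for obj in selection do
(
	cm = CompleteMap()      -- lit element, picks up lights / shadows / raytracing
	cm.outputSzX = cm.outputSzY = 1024
	cm.shadowsOn = true
	cm.enabled = true
	obj.INodeBakeProperties.addBakeElement cm

	nm = NormalsMap()       -- data-only element, no lighting involved
	nm.outputSzX = nm.outputSzY = 1024
	nm.enabled = true
	obj.INodeBakeProperties.addBakeElement nm
)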
Well, overall (sorry for the long text, it proves my confusion), it's still SO strange that it does work with Vray and doesn't the other ways, and that Vray gives such crappy results.
Any other thoughts?
I'll go back to the drawing board and start again from scratch, though I still don't know what I could have done wrong…
Best,
Niko