
View Full Version : Displacement Map observations


policarpo
04-13-2004, 05:30 PM
Did some tests and here are my observations.

Sorry that the image is so big, but it had to be. Would like to hear any solutions cause my head hurts trying to figure this one out.

http://www.policarpo.us/samples/displacement.jpg

SP1R1T
04-13-2004, 05:34 PM
Agreed.

It's really, really frustrating to see this. :curious:

samartin
04-13-2004, 05:55 PM
Nice to see, Poli. I used the image you supplied and LW just completely sux (exactly the same results). If you also rotate the image map using the envelope controls, it's inconsistent in the way the displacements are carried out...

C4D is at least doing a pretty good job, and maybe even ZBrush2 data might be good enough for non-film-specific work...

Using NormalDisplace with procedural maps does give good results, however - just not with image maps...

policarpo
04-13-2004, 06:16 PM
Yeah, it is quite annoying to see, but I just wanted to make it plain as day that this is something that needs to evolve.

The future is here...and darn it, we need it.

I've sent off an email to Worley to see if this true displacement tech could be folded into fPrime. Let's hope so.

:beer:

jjburton
04-13-2004, 06:39 PM
Policarpo - Thanks man...I just did some of my own tests and got about the same results. This is very important to me for a number of reasons, including the fact that I'm making render choice decisions for my final project. I had hoped to use displacement technology in it...argh! Let's hope we hear something soon from Newtek and/or Worley about it.

policarpo
04-13-2004, 07:02 PM
If anyone uses 3dsmax or Maya or XSI, could you take the displacement map and attach it to a sphere and render it so we can see the results?

Oh and supposedly Poser 5+ supports subpixel displacement as well...but who ever heard of using Poser as a rendering solution?:drool:

Cheers.

ghopper
04-13-2004, 07:15 PM
Thanks for that.

How come the C4D render time is longer than the LW one ?

But at least we are able to use normal maps for now, right ?

policarpo
04-13-2004, 07:32 PM
Originally posted by ghopper
Thanks for that.

How come the C4D render time is longer than the LW one ?

But at least we are able to use normal maps for now, right ?

My guess is because it is actually trying to do a good job of using the Bitmap as a displacement map instead of ignoring it completely. :drool:

policarpo
04-13-2004, 07:45 PM
Another comparison.

I know this is common knowledge to some of you, but I am currently deciding whether or not to get ZBrush2...and since I use LW and Cinema, I think these tests are of some value.
:applause:

http://www.policarpo.us/samples/displacement2.jpg

policarpo
04-13-2004, 08:10 PM
And just in case anyone missed this great and illuminating thread - it's made me so envious of those other rendering engines out there...
:cry:

http://www.cgtalk.com/showthread.php?s=&threadid=135581&highlight=zBrush

Mattoo
04-13-2004, 09:06 PM
Well, it's not quite all that bad. You just need to settle down on the displace distance and use a bit more tessellation (read: a lot more tessellation).

(see attached image - not the dinosaur one)

Also, my dinosaur model in the last 3DartToPart contest used quite a bit of Normal Displacement (actually "Bump Displacement", which has fewer artifacts).
http://www.3darttopart.com/images/Dino_Matthew_Painter.jpg

Cinema4D's implementation does indeed look quite impressive. I don't see any downside to it and don't know why it wouldn't be film-worthy.

LW has certainly fallen short on innovation over the last few years. I'd like to give those now in control at Newtek the benefit of the doubt, as they had their attention elsewhere. However, it's an unforgiving world, and if the slack is not picked up soon....

I truly believe that ZBrush is the future for creature and character modelling; if LW cannot complement or compete, it will rapidly lose its place in high-end usage.

bloontz
04-13-2004, 09:42 PM
I have Zbrush 2, I've only toyed with it briefly so far but one major problem that I came across was that Lightwave doesn't seem to support 16 bit grayscale images. I haven't tried all formats yet, if anyone has had success with that I'd love to hear about it. I think that the 16 bits of info in the Zbrush displacement maps help a lot with sharp details. Lightwave would need support for them to get the best results.

Eugeny
04-13-2004, 10:05 PM
In a few words: LW displacement sucks.

Here are my tests...


http://www.geocities.com/chuchae2002/Junk/LW_Displacement.txt



A few notes:
As you can see, more tessellation (sub-patch) means more detail, in all tests. As Matthew said, don't use such a big displacement.

An interesting thing I found with Normal Displacement is the ability to blur the image (and increase quality) with texture antialiasing. Unfortunately, this doesn't work with Texture Displacement...

Changing the SubD order from First to After Displacement doesn't affect anything with Normal Displacement, and Texture Displacement only works with First (so I can't figure out, Policarpo, how you got those results with Texture Displacement and the SubD order set to After Displacement).
Normal Displacement with SubD 20 just locked up my LW.

The best result can be achieved with Normal Displacement in Morph mode - you need an offset morph prepared in Modeler.

A few tips:

Don't use Normal Displacement with a high level of SubD - or even with an object that has lots of polygons and SubD -
it can crash LW. So go to Modeler, disable Sub Patch, return to Layout, enable Normal Displacement, set it to Morph, and then reactivate Sub Patch in Modeler.

To get lower render times, use the same sub-patch level for display and render.
(Why such long renders, Policarpo? My tests were made on a dual PIII 1000 with 768 MB SDRAM.)

And another question: did you use LW 8, or am I missing something? :)

I'll make some Max tests tomorrow at my job.

Sensei
04-13-2004, 10:38 PM
Originally posted by bloontz
I have Zbrush 2, I've only toyed with it briefly so far but one major problem that I came across was that Lightwave doesn't seem to support 16 bit grayscale images. I haven't tried all formats yet, if anyone has had success with that I'd love to hear about it. I think that the 16 bits of info in the Zbrush displacement maps help a lot with sharp details. Lightwave would need support for them to get the best results.

The LightWave SDK image loader is able to create and load images that have float (32-bit) components, which gives 96 bits per pixel...

Just in theory..

In practice I don't know whether there are any loaders that use this feature..

lwbob
04-13-2004, 10:45 PM
Yes, displacement in LW needs some work, but this thread seems a little trolltastic.

Finkster
04-13-2004, 10:58 PM
I couldn't help but have a go at this one!
Here's my effort:
http://members.lycos.co.uk/cmoloney/Pics/Displacethis.jpg
I used a subD order of 15 and Fprime for rendering, so it only took a few seconds to get a nice result. Obviously not perfect, but definitely useable.

I agree with much of what Eugeny said. Here's my tips for a good displacement in LW:

1. If you value your sanity, never, never, never use the normals direction to control the normal displacement (eh?!?). Use a morph map created with the Smooth Scale tool in Modeler. It does exactly the same thing as a normals calculation and won't drive you to suicide.
2. In my experience, best results are obtained when the subdivision order is before displacement. Otherwise you will be displacing a low-res mesh rather than a nice high-res one!
3. A pretty high SubD order is necessary, maybe 15-20.
4. Keep your displacement maps high-res too, if you want the best results, obviously.

I don't see why LW shouldn't support 16-bit displacements from Zbrush. One of LW's strongest points is that it uses such high-range calculations and images. Will these displacement maps not load at all or not translate into good displacements?

Here's another example, showing that the displacements hold up pretty well even when zoomed on very close to the object surface:

http://members.lycos.co.uk/cmoloney/Pics/Hairscape.jpg

Happy testing!

policarpo
04-13-2004, 11:04 PM
Awesome Eugeny and Finkster, thanks for the input.

I'll look at tripling my polygons, or at least making them a hell of a lot denser so there's more for the engine to chew on. fPrime is the only way to do these dense tests nowadays.

Great work Mattoo...thanks for the insight...
And lwboob..well...i won't entertain your response cause you've defined your intent fully. :)

jjburton
04-13-2004, 11:11 PM
Thanks Finkster...I'll have a go at this...:)

edit- question, if the subdivide is before displacement, that means that MDD morphed meshes wouldn't work, no?

Alan Daniels
04-13-2004, 11:19 PM
Originally posted by lwbob
Yes, displacement in LW needs some work but this threads seems a little trolltastic.

I agree, this is a somewhat trollish thread. Lighten up, guys! You know that LW's renderer only does per-vertex displacement mapping, so naturally the effect isn't going to look as good as that from a renderer that uses micro-polygons (like Cinema4D's). I can live with this limitation for now. In fact (correct me if I'm wrong), Maya's and 3DS's built-in renderers don't use micropolygons either, so LW isn't that far behind the curve.

That being said, I sincerely hope that after 8.0 ships, Newtek concentrates ALL its efforts on getting the renderer up to speed for 8.1. I want it to have the same capability as newer renderers such as Brazil and VRay. I believe Newtek realizes how important it is to have a world-class renderer, so I'm not worried. Plus, I'd hate to have to go spend a bunch of money switching apps. :)

(P.S. I definitely want great looking output, but I don't necessarily want LW's renderer in the same ballpark as PRMan or Mental Ray. I'd still like to be able to render scenes without writing C++ code and fiddling with 200-odd dialog options. )

Brett H.
04-13-2004, 11:19 PM
1. If you value your sanity, never, never, never use the normals direction to control the normal displacement (eh?!?). Use a morph map created with the Smooth Scale tool in Modeler. It does exactly the same thing as a normals calculation and won't drive you to suicide.
This begs for some sort of tutorial, even a short one. I'm dying to try this on some models (I can just see painting displacement like you would paint a texture, the possibilities are endless), but I just don't get what is meant by using a morph map created by smooth scale. I'm by no means a noob, but that's just a bit too vague for me to figure out.

Brett

policarpo
04-13-2004, 11:37 PM
Not really sure what is trollish about this thread...just showing what my current setup is and what the limitations are in both LW and Cinema, since displacement map textures are the next wave of image-making in the industry.

Just think of what it would feel like to do your displacement painting in something like BodyPaint or Aura and be able to render it nicely in LW or Cinema because it now supports micropolygon displacement rendering. I've submitted the request to NT via feature requests for good measure as well.

If I hadn't posted this friggin' thread we wouldn't have found out about Mattoo, Eugeny and Finkster's approaches to solving this dilemma.

Ah well...nothing like complacency to show one's true colors. :)

Thanks again you three for the insights. (note: never ever ever Subdivide your object with Metaform twice in Modeler while Layout is still open with the Sub-D preview set to 15...whole system freeze.)

:drool:

lwbob
04-13-2004, 11:47 PM
Because it was posted in the Lightwave section telling everyone how much better the other programs are. Especially by someone that usually counters user complaints with telling them to email newtek's lwfeatures email.

It wasn't posted in any way as a positive way to explain how to get around the problem.

Mattoo
04-13-2004, 11:48 PM
I don't think it's trollish at all. It was a perfectly good question/thread. If you'd already known the answers/hacks - then maybe....

BrettH. In the Normal Displacement plugin you can set it to displace based on a Morph, so all you're doing is giving the displace direction and amount, so it doesn't have to work them out (which is mucho slow).

It is quicker but I haven't noticed it fix any of the real problems. ie, UV seams and random non-displaced vertices.
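What the Morph mode buys you can be sketched in a few lines: the plugin no longer has to work out a displacement direction per vertex, it just slides each point toward a pre-built morph target (e.g. one made with Modeler's Smooth Scale) by the map's intensity. A toy illustration - the function name and data layout are made up for the example, not LightWave SDK calls:

```python
# Morph-based displacement: instead of computing a normal per vertex,
# move each vertex toward a pre-built morph target by the displacement
# map's intensity. Purely illustrative, not LightWave's SDK.

def morph_displace(base_points, morph_points, intensities):
    """Return displaced points: base + (morph - base) * intensity."""
    displaced = []
    for (bx, by, bz), (mx, my, mz), t in zip(base_points, morph_points, intensities):
        displaced.append((bx + (mx - bx) * t,
                          by + (my - by) * t,
                          bz + (mz - mz) * t if False else bz + (mz - bz) * t))
    return displaced

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
target = [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0)]   # smooth-scaled copy
print(morph_displace(base, target, [0.0, 1.0]))
# first vertex unmoved, second pushed fully to the morph target
```

Because the direction and amount are baked into the morph, there is nothing per-vertex left to "work out", which is why it renders faster.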

bloontz
04-13-2004, 11:58 PM
Originally posted by Sensei
LightWave SDK image loader is able to create and load images which has float (32 bit) component, which gives 96 bits per pixel resolution...

Just in theory..

In practice I don't know whether there're some loaders which uses this feature..

I have no problem loading 16-bit-per-channel RGB images in the formats that are supported. It's when I try to load a 16-bit greyscale (in the same formats that load fine as RGB) that LightWave fails. I agree, it seems that it should be able to handle it, but it doesn't seem to like 16-bit greyscale. I need to play with it further; it's possible that converting the greyscale maps to RGB may work.
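If channel duplication does turn out to be the fix, the conversion itself is trivial: replicate the single 16-bit channel into R, G and B. A sketch of the data step with NumPy (the actual PSD/TIFF file I/O is left to whatever converter you use):

```python
import numpy as np

# Duplicate a 16-bit grayscale displacement map into three identical
# RGB channels, so an app that rejects 16-bit grayscale but accepts
# 16-bit RGB can still load the full-precision data.
gray = np.linspace(0, 65535, 16, dtype=np.uint16).reshape(4, 4)  # toy map
rgb = np.stack([gray, gray, gray], axis=-1)  # shape (4, 4, 3), still uint16

# precision is preserved exactly: any one channel equals the original
assert rgb.dtype == np.uint16
assert (rgb[..., 0] == gray).all()
```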

policarpo
04-13-2004, 11:59 PM
Originally posted by lwbob
Because it was posted in the Lightwave section telling everyone how much better the other programs are. Especially by someone that usually counters user complaints with telling them to email newtek's lwfeatures email.

It wasn't posted in any way as a positive way to explain how to get around the problem.

Did you even read the copy in the initial graphic?

Both apps are problematic and the only seeming viable solution of a High Quality nature is to use zBrush for rendering until we get a usable solution in LW (or until someone figures out how to solve the issues we face with workarounds).

And as I stated before, I emailed NT about the issue.

Man alive...how the hell do you learn anything if you don't try and break it or at least discover work arounds to the problem? I generally show the problem and hope people can help me out. Thank goodness for the good people like Eugeny, Mattoo and Finskter.:buttrock:

another LW zBrush example (http://www.pixolator.com/zbc-bin/ultimatebb.cgi?ubb=get_topic&f=1&t=015018)

Sensei
04-14-2004, 12:16 AM
Originally posted by bloontz
I have no problem loading 16 bit per channel rgb images in the formats that are supported. It's when I try to load a 16 bit greyscale (in the same formats that load fine as rgb) that lightwave fails. I agree, it seems that it should be able to handle it but it doesn't seem to like 16 bit greyscale. I need to play with it further, it's possible that converting the greyscale maps to rgb may work.

What file format did you check? I would like to test it here..

Sounds like a bug or an unsupported file format...

lwbob
04-14-2004, 12:32 AM
Originally posted by policarpo
Did you even read the copy in the initial graphic?

The stuff in the pointlessly large image (not everyone stretches their browser across two screens) or the stuff posted as a message?



And as I stated before, I emailed NT about the issue.

Yeah, but you say all the time that you are leaving the cgtalk-lightwave forum, so that's why a post starting off the way yours did sounded trollish, like I said. If that wasn't the intent, you know at least two people saw it that way.

bloontz
04-14-2004, 12:35 AM
Originally posted by Sensei
What file format did you check? I would like to test it here..

Sounds like a bug or an unsupported file format...

So far I've tried the default PSD that ZBrush gave me, TIFF, and RPF. I've had a hard time finding ways to convert images using 16 bits per channel. I think the utility I tried for RPF may have been faulty, as it wouldn't load the RPF itself. I replied to the post that Policarpo linked above to ask what format he is using; his results look good.

And thanks for starting this thread Policarpo, it has been informative.

jjburton
04-14-2004, 12:49 AM
lwbob - I think you might want to calm down, man. He brought up legitimate issues, and through that I, for one, may have found a way to use LightWave more in my final project at school. I hadn't been able to find that information before, even though I looked.

As to the image size, yeah it could be smaller, but I don't see that it's really that big of deal, it allows you to see the detail in the renders which helps in understanding what's going on.

ChrisBasken
04-14-2004, 01:10 AM
Originally posted by Alan Daniels
I agree, this is a somewhat trollish thread. Lighten up, guys! You know that LW's renderer only does per-vertex displacement mapping, so naturally the effect isn't going to look as good as that from a renderer that uses micro-polygons (like Cinema4D's). I can live with this limitation for now. In fact (correct me if I'm wrong), Maya's and 3DS's built-in renderers don't use micropolygons either, so LW isn't that far behind the curve.

*apologies for newbieness*

What are micro-polygons?

Shade01
04-14-2004, 01:15 AM
Is it time for me to pull this thread? The next off topic comment/complaint and I'll close this down.

Mattoo
04-14-2004, 01:19 AM
OK, I meant to post this in the other thread ages ago but never got around to it.
Here's some dinosaur hide. For the 3DArtToPart competition I thought I'd spruce up my 4-year-old T. rex with some funky new displaced skin... good idea... hmm...?.. sort of.

Anyway, images speak louder than words:
http://web.ukonline.co.uk/matthew.p3/DispProblems.jpg

As you can see, the Normal Displacement gave much sharper definition but gave me inexplicable dings and dents (undisplaced vertices). The Morphed version doesn't suffer from this but is noticeably less defined.

I ended up using Bump Displace, which still wasn't as sharp as Normal Displace, but it was a little better than a Morph Displace and of course it had no annoying pin pricks in it.

I hope my woes save someone some experimentation.

After the contest I spent some good time updating the creaky old T. rex model with the intent of ZBrushing it..... I'll have to see how that goes; the UV map was a bugger.... almost entirely continuous to avoid seams. Should be fun. I'll post when I get something.

(apologies for the essay) :D

Mattoo
04-14-2004, 01:41 AM
I forgot to mention (although I did state this in the other similar thread): LW's real stumbling block isn't that it doesn't HAVE sub-pixel/micro-poly displacement - it's that it can't deal with UV seams. That is the killer.

It's not that LWs Normal Displacement is actually broken as such, it just doesn't have a function for handling the seams. The other renderers had the same problem, and still do in certain situations. But they added functionality to deal with the problem.

I'd certainly forgo the micro-poly/sub-pixel hooja-ma-flip in favour of a more robust displacement method first. I'll happily wait for the micro-polys (etc.)...

Which does make me wonder. Normal Displacement in LW is just a plugin... all it would take is someone to write a better one, one that deals with the seam issue.... :lightbulb

NanoGator
04-14-2004, 02:24 AM
Out of curiosity, have you tried resizing the image down to the resolution of vertices you have there? I've had better luck by doing that...

Mattoo
04-14-2004, 02:28 AM
Originally posted by NanoGator
Out of curiosity, have you tried resizing the image down to the resolution of vertices you have there? I've had better luck by doing that...

Of sorts. I tried first cranking up the texture antialiasing. Nuthin.
Then tried some FPBlur on the texture... nuthin.

I know it's not the texture, I can swap it out for another - or even a procedural and they're still there. And there's nothing funny about those vertices either, some are in areas where there is nothing but quads.

I've had it happen quite a few attempts. On my Gremlin avatar - if you look real close on the original image you can see some. :(

NanoGator
04-14-2004, 02:32 AM
Hmm I'm not sure if blurring it would be the same as resizing it down... I'd try it just to rule it out, at least. I mean, think about what LW's doing here. It's looking at a B&W image, determining its intensity, and translating the vertex by that # times the maximum threshold..
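NanoGator's description of what LW is doing here - sample the B&W image's intensity at each vertex and translate the vertex by that number times the maximum displacement - can be written out directly. A toy sketch with nearest-neighbour sampling (hypothetical names, not LW internals):

```python
# Per-vertex displacement: sample a grayscale map at each vertex's UV
# coordinate and scale by the maximum displacement distance.

def sample_nearest(img, u, v):
    """Nearest-neighbour lookup in a row-major grayscale image (0..1)."""
    h, w = len(img), len(img[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return img[y][x]

def displace(vertices_uv, img, max_disp):
    """Displacement amount per vertex: intensity * max displacement."""
    return [sample_nearest(img, u, v) * max_disp for (u, v) in vertices_uv]

img = [[0.0, 0.5], [0.5, 1.0]]  # tiny 2x2 grayscale map
print(displace([(0.0, 0.0), (0.9, 0.9)], img, 2.0))  # → [0.0, 2.0]
```

Seen this way, resizing the map down to the vertex resolution pre-averages the pixels each vertex would otherwise "miss" - which is not quite the same operation as blurring at full resolution.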

Finkster
04-14-2004, 02:33 AM
Originally posted by Brett H.
This begs for some sort of tutorial, even a short one.

Here you go: intro to Normal displacement (http://members.lycos.co.uk/cmoloney/Pics/Intro%20to%20NormalDisplacement.jpg)

jjburton

Thanks Finkster...I'll have a go at this...

edit- question, if the subdivide is before displacement, that means that MDD morphed meshes wouldn't work, no?


Not sure about that one at all - not much experience in that area, I'm afraid. Sounds like it may be problematic, but someone is sure to know a workaround.

bloontz

So far I've tried the default psd that Zbrush gave me, tiff, and rpf. I've had a hard time finding ways to convert images using 16 bit per channel.


Have you tried HDRShop (http://www.debevec.org/HDRShop/) , it can read .tiffs and can output several HDR file formats that LW can read.

Jake
04-14-2004, 02:42 AM
Finkster, you rock!!! That intro page is beautiful.

Concerning the problems people have with seams in the uv mapping: do you encounter the same problems if you set your uvs up in Zbrush2 and then use that mapping solution for the displacement?

Brett H.
04-14-2004, 02:42 AM
Thanks for the link, Finkster, I had never read that. It's a great primer course for anyone interested in what the heck is up with all this "Normal Displacement" stuff you've been hearing about...

Brett

Mattoo
04-14-2004, 02:46 AM
Originally posted by NanoGator
Hmm I'm not sure if blurring it would be the same as resizing it down... I'd try it just to rule it out, at least. I mean, think about what LW's doing here. It's looking at a B&W image, determining its intensity, and translating the vertex by that # times the maximum threshold..

Hmm, I'm not too sure what you're getting at there? I blurred it so much that the texture was near enough grey. That effectively ballooned it into a big fat dinosaur... with little pin pricks in it - in the same places.
As I said, I tried different images (pretty much randomly), some much lower res.... and I still got it. If I apply the same image to a different mesh it's fine.

There's something funky with the mesh, I'd happily dump the thing out as an .OBJ (effectively cleaning it) and re-import but it would lose all the weighting..... not good.

I'm sure I'll figure it out eventually, too busy right now with other stuff before I get back to this. Was just wondering if anyone had seen it before? I get it quite a lot.

jjburton
04-14-2004, 02:49 AM
Many many thanks finkster...I'll see if it works with the mdd.

Mattoo
04-14-2004, 02:50 AM
Originally posted by Jake
Finkster, you rock!!! That intro page is beautiful.

Concerning the problems people have with seams in the uv mapping: do you encounter the same problems if you set your uvs up in Zbrush2 and then use that mapping solution for the displacement?

It wouldn't make any diff. UVs are UVs, regardless of where they were edited.
The UV seams problem is a renderer problem - not a problem with the mesh itself.

bloontz
04-14-2004, 02:50 AM
Thanks Finkster, I'll have a look at HDRShop. I received a reply from the person on the ZBrush thread that Policarpo linked above and was informed that I should just convert the greyscale to a 16-bit/channel RGB and it will work, so I'll try that. His results look very good.

architook
04-14-2004, 03:14 AM
> The UV seams problem is a renderer problem - not a problem with the mesh itself.

Forgive my ignorance, but what is "the UV seams problem"?

Is it LW showing a ghost of the texture color around the edges of a UV polygon when it's doing texture antialiasing? Probably from using a blurry texture but blurring stuff from outside the UV region?

Spacemanbob
04-14-2004, 03:33 AM
OK, maybe I am lost here. I am adding this texture as a bumpmap and it looks fine, minus the artifacts from it being saved as a JPG.

Am I doing something wrong here? I tried to add it as a texture displacement under blending mode and nothing happens. When I just add it as the bump it shows fine. I'm lost, I guess.

This was done in FPrime in 10 passes.

http://www.spacemanbob.com/3Dobjects/samplebumpmap.jpg

Maybe I am just too new to understand this part of it. Hmm..

Cman
04-14-2004, 03:35 AM
Originally posted by Finkster
Here you go: intro to Normal displacement (http://members.lycos.co.uk/cmoloney/Pics/Intro%20to%20NormalDisplacement.jpg)




great little tute!!

Cman
04-14-2004, 04:11 AM
Originally posted by Spacemanbob
OK, maybe I am lost here. I am adding this texture as a bumpmap and it looks fine, minus the artifacts from it being saved as a JPG.

Am I doing something wrong here? I tried to add it as a texture displacement under blending mode and nothing happens. When I just add it as the bump it shows fine. I'm lost, I guess.

This was done in FPrime in 10 passes.

Maybe I am just too new to understand this part of it. Hmm..

Yes you are missing something.
A bumpmap does not actually displace the surface.

policarpo
04-14-2004, 07:11 AM
Very cool dudes. Thanks for posting the info.

I really appreciate it as do others I am sure :beer:

PixelInfected
04-14-2004, 10:29 AM
After I saw the "Dinosaurs making-of" I played with LW's displacement mapping, subdivision and some optimization approaches.

The first problem I found was the UV seam, but with some alpha on the displacement image I got around it (a bit tedious and time-wasting).

The second problem was the high polygon resolution of the mesh, because I need a high subdivision resolution to get fine detail.

I found a partial solution using a long and annoying pipeline:

1) Build the mesh and subdivide it.
2) Animate it with bones.
3) Use a plugin to save the transformed mesh for every frame.
4) Go into Modeler and batch-optimize all the objects produced a bit.
5) Reload the animated mesh with the old Object Replace plugin.

It works well if you don't optimize to the extreme, and it's faster than the original mesh (because LW doesn't need to calculate point motion for subdivision, just for a simple mesh).

Problems:
It's a long pipeline, and motion blur doesn't work correctly on the mesh, because it doesn't move its points but replaces them.

I lost my tests in a hard disk crash and never restarted because I couldn't find the time, but maybe my idea can help someone...
Have a nice day.

Alan Daniels
04-14-2004, 02:53 PM
Originally posted by ChrisBasken
What are micro-polygons?

Chris, since nobody answered your question, I'll take a crack at it...

Micro-polygons are when the renderer takes polygons or subdivision surfaces (Sub-Ds) and chops them up into polygons so small that each one is guaranteed to be smaller than whatever size the renderer is sampling at. For example, if you don't have anti-aliasing turned on, the renderer does one sample per pixel, and so with "micro-polygons" turned on, the renderer would make sure each polygon is smaller than a pixel.

The advantage of this is that you end up with no aliasing artifacts when using Sub-Ds or displacement maps. For example, in LightWave, you know how you have to set the "subdivision level" for each object that contains Sub-Ds? It defaults to 5, and typically you crank it up to 10 or 20 if you want nice, smooth detail. Basically, you're telling LW how finely you want it to "chop up" the Sub-Ds for rendering. But if LW's renderer supported micro-polygons, you'd never have to enter this. Instead, the renderer would chop up the Sub-Ds for you automatically.

The obvious advantage is that you end up with better-looking Sub-Ds and better-looking surfaces that use displacement maps. The disadvantage is that it's difficult to implement correctly, which is why not all renderers support it. The brute-force approach would be to mindlessly slice and dice until every relevant polygon was smaller than a pixel (or whatever the sample size is), but this would explode the geometry in a scene into a HUGE amount of data. So the programmer writing the renderer has to be more clever and generate the micro-polygons on an as-needed basis.

This difficulty is similar to getting "motion blur" right. Sure, a programmer could write it using a brute-force algorithm, but the result would be so slow, or use so much memory, that it would be unusable for production use.

(Disclaimer: This is my understanding from my own research. Anyone who's worked with high-end renderers such as mental ray, and sees anything wrong with what I've said, please feel free to correct me. I want to make sure I'm providing accurate info.)

Hope this helps.
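Alan's "chop until smaller than a sample" idea can be shown with a toy recursion. This is only the dicing concept in 2D screen space - a production REYES-style renderer manages grids, buckets and shading far more cleverly:

```python
# Toy micro-polygon dicing: split an axis-aligned screen-space quad
# until every piece is no larger than the sample size, producing
# "micro-polygons" on demand rather than up front.

def dice(x0, y0, x1, y1, sample_size, out):
    w, h = x1 - x0, y1 - y0
    if w <= sample_size and h <= sample_size:
        out.append((x0, y0, x1, y1))      # small enough: emit it
    elif w >= h:                          # otherwise split the longer axis
        mid = (x0 + x1) / 2.0
        dice(x0, y0, mid, y1, sample_size, out)
        dice(mid, y0, x1, y1, sample_size, out)
    else:
        mid = (y0 + y1) / 2.0
        dice(x0, y0, x1, mid, sample_size, out)
        dice(x0, mid, x1, y1, sample_size, out)

micro = []
dice(0.0, 0.0, 4.0, 4.0, 1.0, micro)  # a 4x4-pixel quad, 1-pixel samples
print(len(micro))                      # → 16 micro-polygons
```

Note how the count explodes with resolution: the same quad covering 400x400 pixels would dice into 160,000 pieces, which is exactly why the "as-needed" generation Alan mentions matters.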

ostov
04-14-2004, 03:31 PM
Do you guys want a render from XSI/3ds Max (mental ray), just for checking out the quality and render time... :)

policarpo
04-14-2004, 04:26 PM
Originally posted by ostov
Do you guys want a render from XSI/3ds Max (mental ray), just for checking out the quality and render time... :)

If you wanna...but we know it will look perfect since MR supports micropolygon displacements.

:-)

My next big question is: how do we take advantage of displacement maps generated in ZBrush 2?

Are we facing UV map and displacement mapping limitations? If someone can articulate this, could you email lwfeatures@newtek.com so that this issue can be brought to light?

Since we rely on so many plugins in LW, it seems there might be a solution for resolving these issues. It would be cool to see a resolution in 8.x, because I would love to be able to take advantage of ZBrush (I am probably getting it in August or so...or at least after a demo is made available).

SP1R1T
04-14-2004, 04:47 PM
Originally posted by PixelInfected

first problem i found is uv seam, but with some alpha on dispacement image i go around (bit tedious and time wasting) this problem.


Anyone have some more detail, or better yet, a tutorial on how to do this? I'm not quite clear on how an alpha could fix the UV seam.

If it can, what's the disadvantage, and why aren't more people doing it?

NanoGator
04-14-2004, 05:09 PM
Hmm, I wonder if C4D's displacement actually performs an extrude... well, that'd explain its good quality, I think.

ChrisBasken
04-14-2004, 06:55 PM
Originally posted by Alan Daniels
Micro-polygons are when the renderer takes polygons or subdivision surfaces (Sub-Ds), and chops them up into polygons so small, that each one will be guaranteed to be smaller then whatever size the renderer is sampling at. For example, if you don't have anti-aliasing turned on, the renderer does one sample per pixel, and so with "micropolygons" turned on, the renderer would make sure each polygon is smaller than a pixel.

Okay, so it's just sub-Ds with some kind of automated intelligence behind them.

Thanks! :thumbsup:

Mattoo
04-14-2004, 08:56 PM
Originally posted by SP1R1T
Anyone have some more detail, or better yet, a tutorial on how to do this? I'm not quite clear on how an alpha could fix the UV seam.

If it can, what's the disadvantage, and why aren't more people doing it?

I believe he means that you would have your standard displacement map, but the UV edges on the texture fade off. Another texture, with different mapping, would cover the previous map's seam, with a falloff toward its own UV seams - seamlessly blending together, very much the same way texturing was achieved in the past, before UVs were available.

At least I think that's what he means. I've thought of that, but it sounds like more hassle than it's worth. I'd rather give up and do it in Maya than mess around with that.
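The blend being described - each map's alpha fading to zero toward its own UV border, so the other map takes over across the seam - might look like this numerically. A toy sketch; the falloff shape and margin are arbitrary assumptions, not anything from LW or C4D:

```python
# Hide a UV seam by blending two displacement maps whose weights fade
# to zero at their own UV borders: whichever map is near its seam
# contributes less, and the other map covers for it.

def edge_falloff(u, v, margin=0.1):
    """Weight that is 0 at the UV border and 1 in the interior."""
    def ramp(t):
        return max(0.0, min(1.0, min(t, 1.0 - t) / margin))
    return ramp(u) * ramp(v)

def blend(d1, w1, d2, w2):
    """Normalized weighted blend of two displacement samples."""
    total = w1 + w2
    return (d1 * w1 + d2 * w2) / total if total > 0 else 0.0

# sample point near map 1's seam (u=0.02) but well inside map 2's UV space
w1 = edge_falloff(0.02, 0.5)   # small weight: map 1 is fading out
w2 = edge_falloff(0.5, 0.5)    # full weight
print(blend(0.8, w1, 0.6, w2))  # result dominated by map 2's value
```

The hassle Mattoo mentions is real: every surface needs two overlapping UV layouts plus hand-painted falloffs, which is why this stayed a workaround rather than a standard technique.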

Mattoo
04-14-2004, 09:05 PM
Originally posted by policarpo


My next big question is: how do we take advantage of displacement maps generated in ZBrush 2?

Are we facing UV map and displacement mapping limitations? If someone can articulate this, could you email lwfeatures@newtek.com so that this issue can be brought to light?



I emailed NewTek with a bug report about these problems three years ago.
This was all brought up around that time on the LightWave mailing list... kind of shows how stagnant things have been. :shrug:

I still bought ZBrush 2 with this knowledge; apart from the fact that I happily use Maya as well, ZBrush (and no doubt other, future competitors) is currently the future of organic modelling.
I'm not gonna wait around for LW to catch up before I jump onboard - I just can't afford to, for my own marketability in the job market.

Finkster
04-14-2004, 09:23 PM
I hope this doesn't drag us too off topic, but here's my question anyway:
Other than the ability to animate and pose your ZBrush-enhanced models in LW (or others), are there any reasons not to stay in ZBrush entirely? Does its renderer lack features, quality - what's the story?
I guess I'm coming from the perspective of someone who doesn't animate. Would it be redundant to go the ZBrush-to-LW route if all you do is still art?

Per-Anders
04-14-2004, 11:06 PM
Originally posted by NanoGator
Hmm, I wonder if C4D's displacement actually performs an extrude... well, that'd explain its good quality, I think.

nope, c4d just moves points along their normals. though i think the comparison is slightly unfair, as the lw mesh resolution in that first image was clearly far, far lower than the cinema mesh resolution.

e.g. if that started out as a default sphere that was then put into a hn object you're looking at 1,130,784 quads. of course if it was just a cube then that's only 24,582 quads. the lightwave mesh looks like it's maybe 80k or so, so you'd expect such results.

neither render engine currently supports micro-polygon displacement (MPD) or sub-pixel displacement (SPD) features (though you can of course raise your subdivisions in both to create obscenely detailed meshes, for stills at least).
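For what it's worth, the quad counts Per-Anders quotes follow from each subdivision level splitting every quad into four (pole triangles and non-quad faces nudge real totals slightly off a pure power of four, which is likely why his sphere figure isn't exact). A quick sanity check, assuming a clean all-quad cage:

```python
def quads_at_level(base_quads, level):
    """Each Catmull-Clark-style subdivision level splits every
    quad into four, so counts grow by 4x per level."""
    return base_quads * 4 ** level

# A cube cage (6 quads) in a HyperNURBS object at level 6:
print(quads_at_level(6, 6))   # 24576 -- close to the 24,582 quoted
```

The same geometric growth is why "just raise the subdivisions" gets expensive so quickly: two more levels is a 16x jump in polygon count.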

policarpo
04-15-2004, 12:00 AM
Well, I learned quite a few things in this thread.

My initial test was on a 24 segment sphere that I then subdivided using the Tab key and set the render subd levels to 8 in LW.

The C4D sphere was a basic sphere primitive with a HyperNURBS modifier attached and a setting of 6.

I learned that the way I was doing it in LW was wrong: I actually needed to up the SubD level to 15 or so, and also needed to do a few more things to get it to look presentable.

I'll take that knowledge and see how I can use it to the utmost considering my current setup.

Whenever I get around to buying Z2, I'll probably end up just rendering in there, since full support of displacement maps won't be an issue.

I hope that in the meantime, some creative people figure out how we can take full advantage of Z2 while using our LW and C4D tools.

Cheers!:beer:

bloontz
04-15-2004, 12:05 AM
Originally posted by Finkster
I hope this doesn't drag us too off topic, but here's my question anyway:
Other than the ability to animate and pose your ZBrush-enhanced models in LW (or others), are there any reasons not to stay in ZBrush entirely? Does its renderer lack features, quality - what's the story?
I guess I'm coming from the perspective of someone who doesn't animate. Would it be redundant to go the ZBrush-to-LW route if all you do is still art?

A lot of people seem comfortable with ZBrush for rendering, and it does seem to produce nice results. It has GI lighting that looks good. It doesn't have a lot of other things, though, like volumetrics or particles. The problem I have had with it is that it is very awkward to set scenes up, at least for me. It is not a true 3D environment: you can only have one 3D object active at a time, and once you have placed it and "snapshotted" it to the canvas, it is not possible to return it to 3D. There is only one orthographic view, which can't be rotated. It's a very different kind of workflow. It does allow you to do some alterations using what they call 2.5D techniques. I suggest you try the demo, though only the previous version is available as a demo at the moment.

Jake
04-15-2004, 12:30 AM
What he said.

ZBrush is obviously capable of producing good renders. But to someone who comes from a 3D background, the workflow is really aggravating. In terms of working with a mesh, it lacks a lot of things 3D users take for granted, like multiple orthographic views. Modeling in ZBrush, I have considerable difficulty keeping myself oriented to a specific axis. Moreover, if you're used to doing point modeling - forget it. I don't think it even allows you to weld points. Granted, I haven't been using it for very long, but I don't have as much difficulty jumping into any other 3D app.

Ramon
04-15-2004, 01:13 AM
Originally posted by Mattoo
Ok, I meant to post this in the other thread ages ago but never got around to it.
Here's some dinosaur hide. For the 3DArttoPart competition I thought I'd spruce up my four-year-old T.rex with some funky new displaced skin... good idea... hmm? ...sort of.

Anyway, images speak louder than words:
http://web.ukonline.co.uk/matthew.p3/DispProblems.jpg

I ended up using Bump Displace, which still wasn't as sharp as Normal Displace, but it was a little better than a Morph Displace, and of course it had no annoying pinpricks in it.

I hope my woes save someone some experimentation.

After the contest I spent some good time updating the creaky old T.rex model for the intent of ZBrushing it..... I'll have to see how that goes, the UV map was a bugger.... almost entirely continuous to avoid seams. Should be fun. I'll post when I get something.

(apologies for the essay) :D
Hey Matthew, that REX is awesome! Very nice displacements as well. I tried to visit your website but it only has a generic "forbidden" message that comes up. What's with that?
Anyways, I have also been trying to use bump displacement on the chosen axis (is that axis the one in which your morph map faces?) but when I crank up the SubD levels on render to 20, it hangs LW. The texture map I am using is 4K res, and I have 2GB of RAM and a dual Xeon 2.8GHz.
Any ideas for the hang?

Mattoo
04-15-2004, 02:21 AM
Originally posted by Ramon
Hey Matthew, that REX is awesome! Very nice displacements as well. I tried to visit your website but it only has a generic "forbidden" message that comes up. What's with that?
Anyways, I have also been trying to use bump displacement on the chosen axis (is that axis the one in which your morph map faces?) but when I crank up the SubD levels on render to 20, it hangs LW. The texture map I am using is 4K res, and I have 2GB of RAM and a dual Xeon 2.8GHz.
Any ideas for the hang?

Cheers for the comments Ramon. Yup, my website (for what it was worth) is no longer - thought it was kind of silly keeping ancient work hanging around on a website that was supposed to have WIPs on it.
The setup I was using to create the morph displace is exactly the one Finkster describes in his mini tut: using Smooth Scale in Modeler and creating a morph that the Normal Displace plugin uses.
As to the hang - are you sure it's a hang? Take a look at Task Manager and see if LW is still grinding away.
Taking the T.rex as an example: the base mesh was about 7,000 polygons, which was cranked up to about 1.5 million once I'd set the SubDiv value up to somewhere between 10 and 14 (can't remember which). At about SubDiv 12 my 1GB of RAM failed me and it would start using virtual memory.

However, I'd set up the Normal Displace (or Bump Displace, as I eventually used) first with a low SubDiv display level, just to get the displacement height right. Then I'd crank it up pretty high (a learned guess) once I was done and go make a cup of tea... or read War and Peace... seriously, I think the T.rex must have taken a good 20 minutes to displace at that high a subdivision.

One thing that I have found will sometimes hang LW is going back DOWN the subdiv levels with the Normal Displace plugin in use. In those instances I disable "Enable Deform" in the Scene tab under Globals, then drop the tessellation down.

Hope that helps.
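Mattoo's figures are consistent with LightWave dicing each subpatch into roughly level × level polygons at render time, so 7,000 patches at SubDiv 14 lands near his "about 1.5 million". A back-of-the-envelope sketch - the per-polygon byte figure below is a made-up round number for illustration, not LightWave's actual memory footprint:

```python
def subpatch_polys(base_patches, level):
    """LW's render SubDivision level L dices each subpatch
    into roughly L x L polygons."""
    return base_patches * level * level

def rough_mem_mb(polys, bytes_per_poly=100):
    """Crude estimate: assume ~100 bytes per displaced polygon
    (positions, normals, UVs) -- an illustrative figure only."""
    return polys * bytes_per_poly / (1024 * 1024)

polys = subpatch_polys(7000, 14)
print(polys)                       # 1372000, near the ~1.5 million quoted
print(round(rough_mem_mb(polys)))  # on the order of a hundred-plus MB
```

That order of magnitude also explains why 1GB of RAM gave out around SubDiv 12 once the renderer's own buffers were piled on top.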

bloontz
04-15-2004, 02:36 AM
Jake, there is a zscript in the Quicklinks section of the ZBrush forums called QuadQuick that simulates multiple orthographic views. It's pretty limited - you have to manually refresh to get views to update after editing in the main view. I use the Shift key a lot to snap the model to orientations.

To clarify for those considering zbrush as a rendering app-

ZBrush really began life as more of a painting app that let you paint with 3D objects as well as with what they call 2.5D techniques. You create a final image directly on the canvas that retains depth information, so you end up with a static drawing - but since it contains depth info, you can alter the lighting and add depth of field, fog, shadows, and reflections.

Some things ZBrush doesn't have: true perspective (a biggie), a camera, and lights as objects (you place them on a little sphere, like Bryce).

The thing that attracted 3D folks was the ease with which you could manipulate a mesh using the unique set of tools provided. The developers took note and added more features catering to 3D modelers, like zspheres and texturemaster.

Jake
04-15-2004, 03:51 AM
Thanks bloontz! I'll check it out.

Ramon
04-15-2004, 09:51 AM
Originally posted by Mattoo
Cheers for the comments Ramon. Yup, my website (for what it was worth) is no longer - thought it was kind of silly keeping ancient work hanging around on a website that was supposed to have WIPs on it.
The setup I was using to create the morph displace is exactly the one Finkster describes in his mini tut: using Smooth Scale in Modeler and creating a morph that the Normal Displace plugin uses.

Hope that helps.
Man, I don't know why I'm not understanding the use of smooth scale in the morph map for the purpose of displacement. I must be a fool for not understanding this, but isn't the purpose of a displacement map so that you don't have to worry about the tedious modeling of details, poly flow, etc., by using a grayscale image map instead? If so, then where and why does the smooth scale come into play? What is being "modeled" with the smooth scale function? What purpose does it serve?
I'm sorry that I don't understand this. Please enlighten me, because I absolutely love the details on your T.rex - they're fabulous! If you're feeling up to it, could you share a low-res screenshot of your morph map for the rex and a low-res displacement map? Would that help me understand? Man, I really want to know how this works, and I feel foolish for not "seeing it".
Thanks a bundle! That really is the sweetest T.rex model I've ever seen, really.

jjburton
04-15-2004, 12:18 PM
Ramon,

My understanding of why that is helpful is that you're "teaching" the displacement where to go. By using the morph as a base for the displacement, LW has "tracks" on which to displace the geometry. I could be speaking out my rear here, but that's how I've understood it. :)

Finkster
04-15-2004, 04:16 PM
That's right.
The smooth scale essentially displaces every point in the model along its normal, by an amount you define. In the normal displacement plugin, you apply a map which controls where and how much this morph (smooth scale displacement) will be applied. White areas will apply the smooth scale 100%, black areas will not apply it at all.
The advantage of using a morph map rather than letting the computer calculate the normals is that we have already told it where each point should end up if fully displaced - pretty much what a normals calculation would arrive at, without the lengthy calculation.
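In other words, the morph-map trick amounts to a per-point linear interpolation: each point slides from its base position toward its smooth-scaled (morph) position by the greyscale value sampled at that point. A minimal sketch of that blend - the flat point lists stand in for the plugin's actual per-vertex texture lookup:

```python
def displace(base_points, morph_points, map_values):
    """Blend each point toward its smooth-scale morph target by the
    map value: white (1.0) applies the morph fully, black (0.0) not at all."""
    out = []
    for (bx, by, bz), (mx, my, mz), w in zip(base_points, morph_points,
                                             map_values):
        out.append((bx + (mx - bx) * w,
                    by + (my - by) * w,
                    bz + (mz - bz) * w))
    return out

base  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
morph = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]   # smooth-scaled 1 unit outward
print(displace(base, morph, [1.0, 0.5]))
# first point fully displaced, second point moved halfway
```

Since the morph target is precomputed once in Modeler, the per-frame work is just this cheap blend - no normal calculation over millions of displaced points.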

Ramon
04-15-2004, 07:33 PM
To jjburton and Finkster:
Now I see. Thanks for your elaboration. I don't know why I was thinking that the morph map being referred to was a custom UV map, you know, "the flattened mesh". I don't know why that was in my head. Thanks for clarifying. So then, to recap: the reason to use a smooth-scale morph is so that LW doesn't try to calculate displacement normals for possibly millions of polys - rather, it "shows" the displacement function the low-res cage as a guide.
Very cool idea. Thanks y'all, I'm gonna try that. I'm just a visual artist - never liked crafts - although for things like coming up with this idea, I wish I was a bit more crafty.
Thankfully, I have you all as support.

NewAgeTitan
04-15-2004, 09:48 PM
:bowdown:

This is excellent...it is helping me already.

CGTalk Moderation
01-18-2006, 01:00 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.