
View Full Version : nCloth Simulation Too Heavy


isosceles
06-23-2008, 12:48 AM
I'm trying to optimize the nCloth/Nucleus settings for when I export to an nCloth cache. Currently I must wait overnight just to see any sort of test, and this has become tiresome. So I need help tweaking my settings to make simulation times reasonable, so I can see the effect the 'Input Mesh Attract' animation is having. I need to be able to see how my animation is affecting the nCloth simulation, adjust accordingly, and re-simulate. Warning: this is a somewhat experimental use of nCloth.

So I import a model from Xenodream and apply nCloth, changing only the 'Air Density' to 10 (in the Nucleus node). I then animate it using deformers and set the nCloth 'Input Mesh Attract' to 0.75. All other settings in the nClothShape and Nucleus nodes are left at default to achieve the melty-goo effect. The current test you see below is fairly high poly, and I thought the dense geometry was increasing cache times. But when I made a much lower-poly version, the nCloth cache times were no different.

http://zombiedasein.com/fun/nanoswarm_test.mov

I am stuck! Any tips or help is greatly appreciated.

=======
ISOSCELES
http://www.zombiedasein.com
http://www.elkcloner.com

SpaXe
06-23-2008, 04:25 AM
There's a Quality Settings section in your output nCloth shape node; try turning those values down a bit. Also, your solver has a max iteration count.

The best part is, once you get the timing down, you can key these numbers so the solver takes more iterations in the beginning, then sacrifices a bit of quality for sim time, since they all crash later anyway.
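The idea of keying quality down over time can be sketched in plain Python. This is just an illustration of a linear keyframe ramp; `keyed_value` is a hypothetical helper, and inside Maya you would instead set keyframes on the solver attributes themselves:

```python
def keyed_value(frame, start_frame, end_frame, start_value, end_value):
    """Linearly interpolate an attribute between two keyframes,
    clamping outside the keyed range (like linear tangents)."""
    if frame <= start_frame:
        return start_value
    if frame >= end_frame:
        return end_value
    t = (frame - start_frame) / (end_frame - start_frame)
    return start_value + t * (end_value - start_value)

# e.g. substeps keyed from 3 at frame 1 down to 1 at frame 101:
for f in (1, 51, 101):
    print(f, keyed_value(f, 1, 101, 3, 1))  # 3 at the start, 1 by the end
```

The same ramp shape applies to max iterations or any other quality attribute you choose to key.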

Let me know if that works for you.

Keep using the low-poly for sure.

PS: that looks really cool. I look forward to seeing some progress! ;)

john_homer
06-23-2008, 08:06 AM
I would imagine it is your thickness settings slowing down the simulation.

If the thickness is larger than the average edge length, simulation time suffers greatly.
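As a rough sketch of that rule of thumb (plain Python with stand-in numbers; in Maya you'd measure the actual mesh), keep the thickness at some fraction of the average edge length:

```python
def safe_thickness(edge_lengths, fraction=0.5):
    """Suggest a collision thickness at `fraction` of the average
    edge length, so per-frame collision cost stays reasonable.
    Stand-in helper, not part of the Maya API."""
    avg = sum(edge_lengths) / len(edge_lengths)
    return fraction * avg

# Hypothetical edge lengths in scene units:
print(safe_thickness([0.8, 1.0, 1.2]))  # 0.5
```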

How many faces is the model, and how long is it taking to solve per frame?

Don't use the defaults; use a preset, after setting your solver scale correctly (see the docs).

.j

isosceles
06-26-2008, 10:04 PM
Thanks for the quality settings tip, that really helped get things going. For previewing animation I push the substeps, max collision iterations, and quality settings all down to 1. Yes, it's very vert-glitchy, but I'm looking for that exact aesthetic for the final render.

The low-poly version was giving me too many problems; Maya's poly reduce has never worked too well for me. So I've decided to stick with the incredibly high-poly version. At 160,500 faces and solving at three seconds per frame, I'm surprised Maya can keep up!

I've read the docs and I still don't fully understand how to tweak the solver-scale. Any tips to help me wrap my head around it?

Sorry, I forgot to mention the thickness settings; I tweak them so early in my adjustments that they slipped my mind! Definitely ran into that problem in the past...


I'll update with playblasts when the final animations are tasty.
Thanks fellow CGI hackers!

SpaXe
06-27-2008, 06:36 AM
Awesome, I'd love to see where you take it.

In regards to the solver scale, I thought it's just a matter of ... how fast the thing moves in time? I could be mistaken, but again, I haven't really used it that much to know.

john_homer
06-28-2008, 07:46 AM
isosceles, 160,500 faces simulating at 3 seconds per frame! Did you mean minutes???
Or is this how slow it is playing back with the cache??


In regards to the solver scale, I thought it's just a matter of ... how fast the thing moves in time? I could be mistaken...

yes, you are mistaken.. nothing to do with speed..
you might be thinking of "time scale" used for making cloth slow-motion etc..

space scale is more about the size you are working in...
from the docs..

Determines the relative space scale applied to this Maya Nucleus solver. The Maya Nucleus solver treats nCloth objects as a scale model and applies the specified forces internally to get the expected behavior for the actual nCloth object at its actual size. The default value is 1.

Gravity interprets Maya’s units as meters. When the working units of your nCloth’s scene is not set to meters (such as Maya’s default centimeter working unit), you may need to adjust the Space Scale of your nCloth’s Maya Nucleus solver. Otherwise, the large-sized nCloth objects in your scene may not behave as desired. For example, when Space Scale is 1.0 (default), Gravity treats a 100 centimeter wide nCloth object like it is 100 meters wide. To improve the behavior of your large-sized nCloth objects, reduce the Space Scale value.

If you are modeling such that one unit is equal to one centimeter, the Space Scale value should be set to 0.01.
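That rule of thumb amounts to: Space Scale = how many meters one scene unit represents. A tiny sketch (the unit table is my assumption, covering common Maya working units):

```python
# Meters represented by one scene unit, per working unit setting:
METERS_PER_UNIT = {"mm": 0.001, "cm": 0.01, "m": 1.0}

def suggested_space_scale(working_unit):
    """Nucleus Space Scale should equal the meters per scene unit."""
    return METERS_PER_UNIT[working_unit]

print(suggested_space_scale("cm"))  # 0.01, matching the docs' example
```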

isosceles
06-29-2008, 02:50 AM
To be more precise about my settings: I lower the substeps and max collision iterations to 1. The quality settings are set to 100 (500 is the default; lowering to 1 gets some insane extreme vert-glitching, beauty if you love it too). I also turned off self-collisions.

With these settings, I'm serious when I say the simulation is running 160,500 faces at 3 to 6 seconds per frame (on a Dual 2GHz PowerPC G5). The timeline scrubs slowly, but it updates enough to see rough movement without requiring a playblast.

Very interesting to hear that 'time scale' can allow ncloth slow-motion... hmmmmm!

Thanks for the Space Scale explanation, I understand it now. I've seen Duncan mention that "If you are modeling such that one unit is equal to one centimeter, the Space Scale value should be set to 0.01" ...but I never quite understood its implications.

Here is a playblast! Thanks a lot everyone, I can't wait to experiment with this in my free time.
http://zombiedasein.com/fun/nclothplayblast_motrack.mov

CGTalk Moderation
06-29-2008, 02:50 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.