Nucleus substeps and caching issue


#1

Hi all,

I have an nCloth simulation that’s being driven by some underlying animation. The animation and nCloth have been animated and simmed in real time (25fps) and look good. For various shots I will need to slow down the animation and cloth to emulate a 50fps or faster shutter.

Creating an nCache with default settings and slowing it down doesn’t work, as the linear interpolation of ripples traversing the cloth doesn’t look right.

I’ve tried changing the ‘Evaluate every’ attr when creating the nCache, but this completely changes the look of the simulation. It doesn’t matter if I lower the nucleus node’s substeps to compensate: if I change the ‘Evaluate every’ attr to 0.1 and set my nucleus substeps to 1 (from 10), the sim still looks wrong. The same goes for increasing the oversampling on the dynGlobals1 node; the sim doesn’t look correct.

There must be a simple way of doing this. What I don’t understand is how the nucleus solver’s substeps differ from over-sampling the timeline…

If the nucleus node has calculated the position of a vert 10 times between frames, surely I should be able to save those positions out to a cache without affecting the look of the sim?

Thanks for any pointers!


#2

If you need to slow down for the final output, do it in post. Cache and render the whole dynamic animation by playing every 0.5 frame, rendering every 0.5 frame, and using the right oversampling and caching settings. You could do this only for the part that needs to be slowed down, of course, but by doing it for the whole animation you would have fewer headaches and more flexibility in post.
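
Something along these lines, if you want to script the half-frame stepping before you cache/render (a rough maya.cmds sketch; the 0.5 step and the use of the current playback range are just illustrations, adjust to your scene):

```python
# Rough sketch (Python / maya.cmds): step the timeline on half frames so the
# solver evaluates every 0.5 frame before caching/rendering. The 0.5 increment
# and the playback-range query are assumptions for illustration.
import maya.cmds as cmds

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

cmds.playbackOptions(by=0.5)            # half-frame playback increment
cmds.currentTime(start, edit=True)      # rewind so the solver starts clean

t = start
while t <= end:
    cmds.currentTime(t, edit=True)      # forces an evaluation on this subframe
    t += 0.5
```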


#3

I think it should work if you set evaluatedEvery to 0.1 and save every to 1.0. You may need to increase substeps if you want more detail in the sim than in the faster version. Then cache. On the cache, set the scale to 10, and when you play back it should be 10x slower, with every frame a simulated one rather than interpolated.
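
Roughly what that looks like scripted, if it helps (a minimal maya.cmds sketch, assuming the ‘Evaluate every’ / ‘Save every’ options map to the cacheFile command’s simulationRate / sampleMultiplier flags; the node names, path and frame range are placeholders):

```python
# Minimal sketch (maya.cmds); node names, path and frame range are placeholders.
import maya.cmds as cmds

# Write a cache that evaluates every 0.1 frame and saves every evaluation
# (i.e. evaluatedEvery = 0.1, save every = 1).
cmds.cacheFile(fileName='clothSlowMo',
               directory='/tmp/nCache',        # placeholder path
               cacheableNode='nClothShape1',   # placeholder nCloth shape
               startTime=1, endTime=100,       # placeholder frame range
               simulationRate=0.1,
               sampleMultiplier=1)

# Once the cache is attached, scale its playback 10x so every subframe sample
# lands on a whole frame: 10x slower, simulated frames rather than interpolated.
cmds.setAttr('cacheFile1.scale', 10)           # placeholder cacheFile node name
```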

With substeps it attempts to better resolve the simulation, and the substep count is a base iteration value per step. If one simulates on subframes, it will use fewer iterations and substeps per step in order to preserve the nature of the simulation, but will bottom out at one iteration. The substeps are really defined in terms of iterations per 1/24th of a second, not iterations per frame.
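
As a back-of-the-envelope illustration of that (not the actual nucleus code, just the idea that substeps are normalised to 1/24th of a second and bottom out at one):

```python
# Illustration only, not the real nucleus implementation: substeps behave like
# iterations per 1/24th of a second, so smaller steps get fewer substeps each,
# bottoming out at one.
def effective_substeps(base_substeps, step_in_frames, fps=24.0):
    step_in_seconds = step_in_frames / fps
    scaled = base_substeps * step_in_seconds * 24.0   # normalise to 1/24 s
    return max(1, round(scaled))

print(effective_substeps(10, 1.0))   # whole-frame steps at 24fps -> 10 substeps
print(effective_substeps(10, 0.1))   # 0.1-frame steps -> bottoms out at 1
```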


#4

Disregard my last post… it doesn’t seem to work properly. I’ll see if I can find a solution that works for you.


#5

Thanks for looking into it, Duncan. I assumed the caching would work as you described, but I’ve tried every combination I can think of (and bent the ear of everyone in the department) and it doesn’t quite work.

What I thought was happening was that if I cache with Evaluate Every at 0.1, the nucleus node samples each of those 0.1-frame steps with 10 substeps, altering the look of the simulation.

I’ve kind of resigned myself to having three sets of dynamic settings, one each for 25, 50 and 100fps. It’s a shame, as I’ve spent a couple of days getting the amount of ripples and the speed they’re travelling looking good at 25fps, and I was convinced it would be simple to oversample and have it looking the same, just with the option to ramp the speed up or down.


#6

The problem is that if I play back or cache at 0.5, the simulation looks completely different. Not just a little bit: all the dynamic settings look off, and there’s no clear fix like ‘scale the time/mass/drag by a set amount’.


#7

Hmm… So you’ve got something you already love and now you want to slow it down…

Maybe you could still pull something out in post with time warp/stretch tools. Nuke and AE have such tools, but it could end up looking unacceptable depending on your shot.

ChronoSculpt from NewTek can edit the timing of Alembic files if I’m not mistaken, so maybe you could look into that too. At least until Duncan comes up with a proper workflow tip. :wink:


#8

So a couple more observations.

I did a quick test of dropping the cloth I have onto a plane in real time, then did the same with the nucleus time scale set to 0.25. On speeding the scaled one up 4x, they look pretty similar.

I think the problem comes in when I use my input mesh animation to drive the dynamics (using a painted input attract map). I’ve definitely slowed the animation down to 0.25 speed, but it seems like the underlying animation is moving too fast for the simulation to keep up with.

I am having some success with increasing the space scale attr on the nucleus node. I’m now simming at 1 unit : 2 m instead of 1 unit : 10 cm, and it kind of reacts as if it’s slow-mo, which I can then speed up to make it real time. Again, not ideal, but I think it might work for what we need.
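
For anyone trying the same thing, the attribute I’m changing is just this (maya.cmds sketch; ‘nucleus1’ is assumed to be the default solver name and the values mirror the post above):

```python
# Sketch of the workaround above; "nucleus1" and the values are examples only.
import maya.cmds as cmds

# spaceScale is metres per Maya unit, so 0.1 reads as 1 unit : 10 cm.
# Bumping it to 2.0 (1 unit : 2 m) makes the solver treat everything as much
# larger, which plays like slow motion that can be sped back up afterwards.
cmds.setAttr('nucleus1.spaceScale', 2.0)
```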

A simpler, more accurate workaround would be good though :slight_smile:


#9

I have this EXACT problem currently.

Have some cloth whose motion heavily relies on an input mesh.

If I bake to Alembic (or nCache), stepping through on half frames, the sim looks totally different, like the air density just doubled or something. (Quarter frames are twice as bad again.)

I initially thought I could double the time scale attr to compensate, which kind of helps, but the results don’t cleanly negate the effects of solving on half frames.
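
For reference, the compensation I tried was just this (assuming the default ‘nucleus1’ name):

```python
# What I tried (maya.cmds sketch): when solving on half frames, double the
# nucleus time scale so each half-frame step covers a full frame's worth of
# simulated time. "nucleus1" is assumed; as noted above it only partly helps.
import maya.cmds as cmds

cmds.setAttr('nucleus1.timeScale', 2.0)
```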

There doesn’t seem to be a logical combination of settings that simply adds “simulation resolution” (for lack of a better term) on subframes.

The goal is to be able to ramp the frame rate up and down arbitrarily to go between regular motion and slow motion.

It goes without saying that simply baking on whole frames and slowing down the cache playback is not an appropriate solution; that is just interpolation.

Would be great to hear some more thoughts on this! :slight_smile:


#10

Hi,
This issue still exists in 2020.
Did anyone find a solution?
Cheers, D


#11

The solution I’ve found over the past few years is to use Houdini. So much of Maya seems to have been abandoned, development-wise, these days.

I know that doesn’t solve your current problem, but nCloth is a closed box; unless Autodesk release a fix you’re screwed.


#12

@ddankhazi
is it just for doing a slow-motion effect?