View Full Version : memory problems with long file seq. driving texture rate

09 September 2005, 05:07 AM
Hi, I have a 1000-frame shot with a surface emitting particles whose rate is driven by a file texture sequence. The problem is that Maya wants to load the whole sequence into RAM as it goes, which is crashing my box with 2 GB of RAM in it. I have tried disabling Memory Caching in the Dynamics menu, but that does not stop it from using all of my RAM. BOT files, maybe?

Can anyone help me out?



09 September 2005, 08:16 AM
hmm.. I've found several problems with particles lately that involve some obvious memory leaks, especially when using the hardware render buffer..

The only solution I got from Alias was: create a particle disk cache.



09 September 2005, 12:48 PM
Thanks for the reply. The problem is that to create a disk cache, Maya needs to read in the image sequence that drives the texture rate, which crashes it by running out of RAM. I need a way to stop Maya storing the 1000 images in memory. I am exploring BOT files but cannot find a way to convert sequences with makebot. Anyone have any ideas?
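The closest I've got is looping makebot over the frames one by one; a sketch of what I mean (paths are placeholders for my actual sequence):

```mel
// makebot converts a single image per call, so loop over the sequence.
// Paths/padding here are placeholders - adjust to your naming scheme.
for ($i = 1; $i <= 1000; $i++) {
    makebot -i ("images/rateMap." + $i + ".iff")
            -o ("images/rateMap." + $i + ".bot");
}
```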


09 September 2005, 12:53 PM
In this case BOT files won't help you at all, since they are only used for rendering; for emission, Maya needs to read in the complete image.

But this is really strange, since Maya never seemed able to cache files like that. I doubt it is the image files; more likely the particles themselves..

Give this a try: let the simulation run for a while until you have a high memory load, then run the command flushUndo in the Script Editor and see if memory drops. I guess not, but it's worth a try.



09 September 2005, 01:06 PM
Yeah, I guessed as much with the BOT files. I will try the flushUndo command, thanks. I don't think it is just particles in memory: when I reduce the image dimensions by half (but keep the emission rate the same), much less RAM is used and it doesn't crash. But I lose too much detail around the areas I am trying to emit from, so this is not a solution.

I wonder if there is a command to flush the image cache that I could run after each frame. I will have a look.

thanks again for the reply.

09 September 2005, 01:13 PM
emm.. just to be sure: you don't have the "Interactive Sequence Caching Options" switched on in the file node?

By default Maya does not cache the file in nodes with sequences..

Please check that one.



09 September 2005, 01:31 PM
Nothing on in the "Interactive Sequence Caching" options.

Bizarre. I just reloaded the original high-res version of the file sequence and now memory usage is normal. Oh well, just half a day chasing my tail. Gotta love Maya. Let's hope it stays fixed.

thanks for the help alexx.


09 September 2005, 04:16 PM
The problem is back again. Anyone have any idea how to stop Maya caching image files? I know it is not supposed to be able to, but it seems to be happening here......

Any ideas greatly appreciated.


09 September 2005, 06:13 PM
Sorry, I am out of ideas...

09 September 2005, 06:27 PM
Can you do the sequence in pieces? Say, the first 250 frames, then the next 250, and so on? I'm not sure what you're trying to accomplish, but perhaps that might help.

07 July 2006, 07:54 PM
Hey, sorry, slightly off topic, but this exact thing you're talking about is something I'm trying to figure out how to accomplish...

Would you be so kind as to let me in on the secret of how to connect an image sequence to an emitter's rate to drive it? I've tried pretty much everything... all hair pulled out.


07 July 2006, 08:48 PM
Sorry... just figured it out. Please ignore.

Sorry to pollute your thread! :)

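For anyone else who lands here: a sketch of the wiring I mean, assuming a standard surface emitter (node names file1/emitter1 are placeholders, and the exact attribute names may vary with your Maya version):

```mel
// Create a file texture and treat it as an animated sequence.
string $f = `shadingNode -asTexture file`;
setAttr -type "string" ($f + ".fileTextureName") "images/rateMap.1.iff";
setAttr ($f + ".useFrameExtension") 1;              // animate the sequence
expression -s ($f + ".frameExtension = frame;");    // advance with the timeline

// Wire the texture into the surface emitter's per-point rate.
setAttr emitter1.enableTextureRate 1;
connectAttr -f ($f + ".outColor") emitter1.textureRate;
```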


07 July 2006, 05:23 PM
hmmm.... interesting discussion. I guess I have the same problem! I hadn't narrowed it down to the image sequence yet, but I guess that is in fact why my scene is crashing.
I've got a surface emitter and a file texture with an image sequence controlling the emission rate. The image files are 2048x2048 pixels, because I need the detail. The scene is 217 frames. Everything runs fine until the last few frames, then Maya stops emitting particles. After that, if I open the Attribute Editor or do other stuff, Maya crashes.

Well, got no answers for you, but I'll be looking too.


07 July 2006, 08:05 PM
Confirmed! The image sequence driving the emission rate is why Maya crashes. It eats up all the memory, and when there's no more, it stops emitting particles. From there a crash is next in line.

The only way I've got past it so far is lowering the resolution of the image files, which is not really what I wanted to do.

07 July 2006, 10:42 PM
Ouch... this one sounds nasty. Turning off "Use Interactive Sequence Caching" would have been my first choice for sure. I have never experienced this before, thankfully, but here are a few thoughts that may help:

I have looked into the node, and it seems there are a few cache-related options in there. There is a hidden attribute on the file node you can try turning on. From the docs:

(uca) Use Cache is provided for situations when memory is low. If you turn this on, rendering this file texture will use less memory, but it will also be slower.

Maybe

setAttr file1.uca 1;

is your way to freedom?

Another cache-clearing option:

clearCache -all;

This goes into all of the data that can be regenerated if required and removes it from the caches, clearing up space in memory.

For a bit of extra memory: consider running Maya from the command line (no UI). That will free up a bit more RAM for you.

And failing all of this, you could try to hack it: consider writing a MEL-based routine that

1. creates the file node, assigns the image, and connects it to the shader/particleShape node
2. runs one frame of the particle simulation
3. deletes the file node
4. flushes the undo cache
5. repeats from step 1 over the course of your animation

It's a hell of a dirty hack, but it just may clean Maya out enough to work with. Let us know if any of this works; I am sure others will benefit.
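Something along these lines; a rough sketch only, with file1/emitter1-style names and paths as placeholders:

```mel
// Dirty-hack sketch: build a fresh file node per frame, step the sim,
// then throw the node away so the image never accumulates in memory.
global proc emitOneFrame(int $frame)
{
    // 1. create the file node, assign this frame's image, connect it
    string $f = `shadingNode -asTexture file`;
    setAttr -type "string" ($f + ".fileTextureName")
            ("images/rateMap." + $frame + ".iff");
    connectAttr -f ($f + ".outColor") "emitter1.textureRate";
    // 2. run one frame of the particle simulation
    currentTime -e $frame;
    // 3. delete the file node
    delete $f;
    // 4. flush the undo cache so the freed node really goes away
    flushUndo;
}

// 5. repeat over the course of the animation:
for ($i = 1; $i <= 1000; $i++)
    emitOneFrame($i);
```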

07 July 2006, 12:33 AM
Thanks, Mike Rhone!

setAttr file1.uca 1;

does work! Maya doesn't gobble up all the memory now. However, it seems you must set this attribute before playing through the sequence; no memory is freed up if you switch it on afterwards.

The clearCache -all command does nothing for me memory-wise concerning this issue.

And by the way, the "Use BOT" checkbox on the file node (what bigdog talked about) is actually switched on when you use setAttr file1.uca 1.
In the Maya Output Window you will see that it's "producing BOT file for D:/Maya....."

It does run slower this way, but it keeps going and going, never running out of memory :thumbsup:


07 July 2006, 02:21 AM
Booyah! Good to know, and it is now filed away in my memory, because I KNOW that will come up again one day.

07 July 2006, 01:06 PM
Bigdog, what Maya version are you running?
This memory problem is known and logged as a bug in 6.x.
I was kinda expecting it to be fixed in 7.

07 July 2006, 01:54 PM
I don't know what bigdog was running when he posted about a year ago, but personally I'm working with Maya 7 on WinXP, and I had the problem discussed.

07 July 2006, 03:10 PM
Hehe, I should have looked at the date...
Mike, this is great. If only I had discovered this before ;-)

CGTalk Moderation
07 July 2006, 03:10 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.