Memory leaks in MEL


Augie
11-09-2012, 07:30 PM
I've run into a huge problem with just about every large MEL script I've written in the last couple of years. I tend to build tools in Maya to help me with tasks such as laying out navigation meshes in a game level or exporting vertex colors from hundreds of meshes to a binary file. When running these large MEL scripts, I always end up seeing Maya's memory usage expand as the script runs until eventually Maya runs out of memory and crashes. I was using the 32-bit version, but ended up having to go to the 64-bit version because these scripts eventually cause Maya to grow beyond the 4GB limit. Well, even the 64-bit version isn't a big enough band-aid to fix this problem. Maya's currently at 9.4GB right now after having loaded 43,185 path nodes from a tiny 3.2 MB binary file. Maya never releases the memory until it's quit and re-opened.

I really don't know where these memory leaks are coming from. I've tried and tried to debug my code in the past to find the sources, but I always end up with leaks galore anyway. The most common culprit seemed to be not clearing arrays after using them, but I've been very careful to always do that. Also, I have the undo buffer disabled and am going to the extra effort to flush the buffer anyway. History is off as well. I'm out of ideas on where to look.
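For reference, this is roughly what I run at the top of the script (the standard MEL commands for those settings):

undoInfo -state off;              // disable the undo queue entirely
flushUndo;                        // flush anything already queued
constructionHistory -toggle off;  // turn off history for newly created objects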

It's really becoming a productivity killer.

ZIP file containing MEL script and sample path node binaries (http://jupiter.lunarpages.com/%7Eaugiem2/LoadPathNodes.zip)
Just the MEL script (txt) (http://jupiter.lunarpages.com/%7Eaugiem2/LoadPathNodes.mel.txt)

To load a binary, first edit line 16 in the MEL file and change the path to the binary to load, then just copy and paste the entire script into Maya.

Keilun
11-09-2012, 07:41 PM
Can you share an example script? You've already touched on the two issues that most would suggest off the top of their head (undo/construction history). Beyond that, I think a script would be most helpful.

If you can't, then I'd try to isolate your problem by commenting out chunks of your script at a time, until you can get a skeleton of your script running without the apparent "leak". From there, add them back in until you spot the culprit.

I've never run into such a problem with MEL before, so I'm a tad skeptical and feel like there might be something in the way you coded the script or how you're executing it that might be causing the problem.

NaughtyNathan
11-09-2012, 09:01 PM
I've seen similar things before too, and on the occasions where it caused an issue I did a bit of digging and debugging and found a few strange things, plus one pretty startling major one.
In one particular example I was iterating over vertices and performing a polyMoveVertex (iirc) on each vertex (or vertex cluster). I was using the "-ch off" flag, so no construction history was being created and therefore memory would be conserved, right?

wrong.

When this function was running Maya's memory use kept creeping up and up, and if it got too high Maya crashed. When I changed the command flags to "-ch on" to preserve the construction history (so Maya created a new history state every iteration) the memory usage stayed constant and never crept up!!

try this code. Quit Maya and start it fresh. then immediately run this code with your task manager open:
string $obj[] = `polySphere -ch 0 -sx 32 -sy 32`;
string $verts[] = `filterExpand -sm 31 ($obj[0] + ".vtx[*]")`;
for ($v in $verts)
{
    polyMoveVertex -ch off -t `rand(1)` `rand(1)` `rand(1)` $v;
}
on my system (Maya 2013) Maya starts at only 290MB of RAM usage, then goes up to 1GB of RAM when this code is executed.
However, now quit and restart Maya and run the same code, but change the "off" into "on" before you run it, so all construction history is created and preserved. Maya doesn't even go up more than 10-15MB of RAM usage!!

something funny is definitely going on there...

Augie
11-09-2012, 09:38 PM
Can you share an example script? You've already touched on the two issues that most would suggest off the top of their head (undo/construction history). Beyond that, I think a script would be most helpful.

I've never run into such a problem with MEL before, so I'm a tad skeptical and feel like there might be something in the way you coded the script or how you're executing it that might be causing the problem.

I also think it's something I'm doing wrong and not MEL in and of itself, because I don't see much information about this issue in web searches.

I stripped out much of my path node system and just attached the loading algorithm in the zip file link below. Anyway, hopefully the MEL file will reveal some incorrect MEL programming or some other oversight.

ZIP file containing MEL script and sample path node binaries (http://jupiter.lunarpages.com/~augiem2/LoadPathNodes.zip)
Just the MEL script (txt) (http://jupiter.lunarpages.com/~augiem2/LoadPathNodes.mel.txt)

To load a binary, first edit line 16 in the MEL file and change the path to the binary to load, then just copy and paste the entire script into Maya.

Augie
11-09-2012, 09:48 PM
try this code. Quit Maya and start it fresh. then immediately run this code with your task manager open:


Very interesting! I have seen memory leaks like this with certain MEL commands mentioned on other boards before. I forget which ones. Anyway, mine behaved the same way yours did, with -ch off taking far more memory, but I noticed -ch off ran MUCH faster than -ch on, with CPU usage < 50%. The task finished within about 2 seconds with -ch off, but took at least 10 seconds with CPU usage at 98%. Quite interesting and troubling at the same time.

Another weird thing: If you add the line
constructionHistory -tgl 0;
to the beginning of your code and remove the -ch flag from polyMoveVertex, you also won't get the massive memory bloat; it will be slow, and you will still get history on your final object. It acts as if constructionHistory was just left on for some reason, even though it should have disabled it.

Just to test, I did try replacing all the -ch 0 references in my path node script with -ch 1 and I still got the memory bloating anyway. :(

haggi
11-10-2012, 11:15 AM
Would be interesting to check if it helps to turn off the undoQueue. The history only removes the dependencies, but not the undo.

Let's imagine you have a large geometry, modify it and turn off history, but keep the undoQueue. Then for every modification, the geometry is duplicated to be able to do an undo.

If your history is on, then the undo queue only keeps the modifier node, not the complete geometry. This could explain the memory demands if you turn off history.

Of course this explanation is completely useless if you turned off your undoQueue anyway :)

Augie
11-10-2012, 06:35 PM
Would be interesting to check if it helps to turn off the undoQueue. The history only removes the dependencies, but not the undo.

Let's imagine you have a large geometry, modify it and turn off history, but keep the undoQueue. Then for every modification, the geometry is duplicated to be able to do an undo.

If your history is on, then the undo queue only keeps the modifier node, not the complete geometry. This could explain the memory demands if you turn off history.

Of course this explanation is completely useless if you turned off your undoQueue anyway :)

I just tried NaughtyNathan's test with undoInfo -st off and you are correct! It now no longer bloats up PLUS it runs in a couple of seconds with -ch off. If you turn -ch on it takes 10-12 seconds. :)
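For anyone following along, the combined test looks roughly like this (Nathan's loop with the undo queue disabled first):

undoInfo -state off;  // disable undo before running the vertex loop
string $obj[] = `polySphere -ch 0 -sx 32 -sy 32`;
string $verts[] = `filterExpand -sm 31 ($obj[0] + ".vtx[*]")`;
for ($v in $verts)
{
    // with undo off, -ch off no longer bloats memory and runs fast
    polyMoveVertex -ch off -t `rand(1)` `rand(1)` `rand(1)` $v;
}
undoInfo -state on;  // turn undo back on afterwards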

Unfortunately this isn't the issue with my larger MEL script. I already have undo queue off and history off...

By the way, I've updated my posts to include the sample MEL script and some testing binaries.

EdtheHobbit
11-11-2012, 04:14 AM
I wonder if refreshing the maya interface would do anything at all? In my scripts, I tend to add a refresh; call to major loops, mostly because I enjoy watching the script at work. It slows things down to be sure, but it's fun.

But, I've also never seen this memory problem (at least, not that I've noticed).

Keilun
11-12-2012, 04:26 PM
I played around with the script a bit. Here are my notes:

1. I can definitely reproduce the memory issues. On Maya 2013 x64, it ramps up to about 2GB of usage for test1.bytes.

2. If you comment out line 560:

// Create the actual connection
$pf_connectionNames[$cId] = PF_CreateConnectionMayaObject($type, $cId, $node1, $node2);

The script runs a lot faster and does not exhibit memory issues. I also eliminated PF_PositionConnection as the source by commenting only that one out while leaving the above PF_CreateConnectionMayaObject enabled. This leads me to believe there's a problem with the duplicate call here. I'm not sure why. Is it possible for you to just generate the meshes from scratch given that they're fairly simple meshes? That would be my solution here.

3. Why not use the polyPyramid primitive rather than creating a cube and merging the top verts to create a pyramid? This also reduces memory usage and execution time.

$objs = `polyPyramid -w 1 -ns 4 -sh 1 -sc 0 -ax 0 1 0 -cuv 3 -name ("PFRef_pathnode")`;

$objs = `polyPyramid -w 1 -ns 4 -sh 1 -sc 0 -ax 0 1 0 -cuv 3 -name ("PFRef_connection_type" + $i)`;

Note this would require you to rework some of your other code which specifies specific vertex indices of the previous cube. Perhaps also consider using transform modification here (translate/scale in this case) rather than moving the verts directly.
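A rough sketch of what I mean (the position and scale values here are just placeholders for whatever you load from the binary):

string $objs[] = `polyPyramid -w 1 -ns 4 -sh 1 -sc 0 -ax 0 1 0 -cuv 3 -name "PFRef_pathnode"`;

// placeholder values standing in for the node's loaded position and size
float $nodeX = 1.0;
float $nodeY = 0.0;
float $nodeZ = 2.5;
float $nodeScale = 0.5;

// position and size via the transform instead of moving verts directly
move -absolute $nodeX $nodeY $nodeZ $objs[0];
scale -absolute $nodeScale $nodeScale $nodeScale $objs[0];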

---

It's an interesting scenario for sure. I can't explain why the duplicate is such a performance hog here. I didn't test instancing given your comments about speed. I'd still just generate it procedurally rather than rely on a duplicate, particularly since you're going to manually reposition all the verts anyway.

CGTalk Moderation
11-12-2012, 04:26 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.