
Win 7 x64 hangs on too-big memory allocations


zoharl
07-24-2012, 06:06 PM
instead of crashing the app ... :banghead:

Recently I switched to building my apps in x64 (it took me quite a while to rebuild all the libraries I use for x64). Why? Out of boredom, to show off, to feel like top of the line... I have no idea, since my plugins never needed more than 2GB of memory.

To my problem. I found quite a nasty bug in Win 7 x64, which IMO quite clouds x64 development. When you try to allocate a ridiculous amount of memory that your system could never dream of handling (I have 8GB RAM + an 8GB page file), instead of the allocation failing in the application, Windows tries to satisfy it and the system simply hangs; a hard reboot is needed.

Why would I allocate such a ridiculous amount of memory in the first place? Due to a bug: I resized an array to size N, and I forgot to set N. A stupid mistake that, in a debug build, an x32 app would crash on immediately and show you the faulty line. In x64, after three hard reboots, hoping my poor HD doesn't develop bad sectors, and waiting 10 minutes each time for my system to come back up with the whole development environment, I simply decided to go back to x32 just to locate this bug.

BTW, the same thing happens if I define a ridiculously sized matrix in MATLAB x64. Has the computer lost its sense of humor? Actually, it's not that ridiculous when you work with voxels, i.e., 3D matrices. You tend to forget that 1000^3 is quite large, even though you think you need that kind of resolution.

My initial googling discovered that Microsoft doesn't see it as a bug. A few suggested solutions:

http://connectppe.microsoft.com/VisualStudio/feedback/details/731787/large-memory-allocation-using-new-hangs-system
http://social.msdn.microsoft.com/Forums/en-US/vcgeneral/thread/90f6a32e-0a4a-401f-b265-5517d40ab097/
http://stackoverflow.com/questions/98098/vs2005-limit-the-heap-size
http://stackoverflow.com/questions/5578008/visual-c-possible-to-limit-heap-size

So we can tell the linker to limit the heap size, for example with /LARGEADDRESSAWARE:NO. The problem is that this won't work for an .mll (.dll), since maya.exe determines the heap size. It also won't work with MATLAB scripts, since matlab.exe determines the heap size. So the only option is to set some global hook on heap allocation, preferably for the whole system... It might degrade performance, or the cost might be negligible.

What do you think?
How do you handle this problem?

InfernalDarkness
07-24-2012, 06:35 PM
How much RAM are you working with, Zoharl, in that machine? 16GB isn't really a ridiculous amount of RAM these days, but as you've said, you generally don't need that much. I run with 8GB and still run into limits in certain complex scenes, but for my arch/viz work it's plenty, for example. Does your machine have at least 16GB?

zoharl
07-24-2012, 07:24 PM
Sorry for not being clear. I meant in the parentheses that I have 8GB RAM, with an 8GB page file. I didn't actually check how much memory the plugin tried to allocate.

@Infernal, I'm not sure how you can allocate such a size in a standard app (without code involved), although for you it might be easy. I assume you are using Win 7 Pro x64. Please try to find out how your OS behaves when it goes over the limit. For example, create a fluid container in Maya of size 1000^3. (But save your work before experimenting...) Maybe even a large Python/MEL array would do the trick. I'd expect the app to throw an error, not hang the system.

You should all agree with me that it's a Windows bug, right? Windows shouldn't hang because of an app's mistake, or because of a ridiculously sized allocation. So I think the solution should come from the OS, which should somehow restrict the maximum memory an application can allocate.

I'm reluctant to experiment; I'm still surface-testing my HD. What do you think about disabling the page file?

InfernalDarkness
07-24-2012, 07:57 PM
A 1000^3 fluid container does not crash or freeze my workstations, although Maya is so unresponsive as to be almost useless at that resolution. This is likely my GPU; I'll test it for you when I get home to compare the GTX 550 Ti to the GTS 250, and we'll know more. This is also a known "problem" with the single-threaded Dependency Graph in Maya itself.

You should all agree with me that it's a Windows bug, right? Windows shouldn't hang because of an app's mistake, or because of a ridiculously sized allocation. So I think the solution should come from the OS, which should somehow restrict the maximum memory an application can allocate.

I don't think we should all agree with you, as these issues don't currently happen on my 1100T or my FX8120. Also, I don't use Win7 Pro x64 but Win7 Ultimate x64. I rarely use a page file, and almost never for Maya, as most of my scenes weigh in under 4GB, but I have had problems rendering scenes over 8GB, obviously.

The OS does already restrict the maximum memory, however: 192GB for my edition of the operating system, and the same for Pro. It's plenty for my purposes! But if you want to tell your OS to restrict a certain application selectively, you can use one of many tools to do just that. I use ProcessLasso myself, which will also set per-application CPU affinity on startup, kill-on-startup, flush cache per application, and other features that allow greater flexibility and control. There are other programs that do similar things; ProcessLasso is just my personal favorite.

zoharl
07-24-2012, 09:49 PM
Okay, try 2000^3, 3000^3, and keep going until something happens (a crash, a hang, an error message saying it can't be done)...

I don't want to set anything specific per app, and I think it's the OS's job to manage tasks (I mean, you pay X for a monstrous OS, and then buy some small utility at 0.1*X to do the job...). I'm looking for a global restriction preventing all apps from going over some memory threshold, to contain my bug.

Does hardly using a page file mean that your page file is disabled, or does your Lasso program enforce this?

InfernalDarkness
07-24-2012, 09:59 PM
Yep, Zoharl, at a 2K 3D fluid container Windows dropped to 80MB of free RAM and froze outright, even with ProcessLasso in effect. I had to manually restart the machine.

I'm not sure how to approach this from a global OS standpoint, though, because in this case it's not necessarily the OS's job to handle memory limiting; it's the application's job NOT to tell your hardware to "bugger off" and die. Say we had 32GB or 64GB of RAM: would Maya's fluid container test do the same thing? I doubt it. So in this case, at least, it's Maya's "fault" and not Windows' fault. That may not be the case with the programming you're working on, though, my friend, because other scenes that easily pass my 8GB of physical RAM do NOT crash Windows or Maya, or freeze up; they begin swapping to the HD, of course, as they're supposed to.

So for this fluid container test, it's evidently not swapping to virtual memory or using the available hardware properly. In that instance it's Maya's fault, as Windows doesn't know about Maya directly and can't tell it to "behave". Make sense?

It was nice to finally crash this thing, though!

zoharl
07-24-2012, 10:36 PM
So... Let me get this straight. Any buggy app that Zohar wrote might crash the system with some bogus request which the OS knows is bogus (8+8 is the limit in my case, and above that it's bogus), and it's not the OS's fault; the OS is great, stable, and can handle anything?? :curious:
What next, I accidentally write to another process's memory space, and accidentally crash it?
Win95, here we come...
No, I have a better one: I sent the hard disk head to a bogus address and now it's damaged...

BTW, did disabling the page file not work?

InfernalDarkness
07-24-2012, 11:07 PM
Ahh, I see now, but I should have already known, since you're the Complainer-in-Chief around these parts. You're promoting an anti-Windows agenda. Well, that's really easy to fix, Zoharl.

Go get a Mac. (shrugs) Then everything will "just work".


It's not the fault of Microsoft or Windows that you don't know how to program for 64-bit Windows, or that Autodesk wrote a function which in this case causes a massive memory leak and crashes Windows. How could Microsoft know that you would be so inept at programming, or that Autodesk would give their application too much flexibility?

The fault is yours, son. Now go pay Apple like a good consumer and forget about it.

zoharl
07-25-2012, 09:43 AM
Ah @infernal, cheery as always ;)

Are you serious?? I hope you got a good night's sleep and reconsidered. If not, consider the following: we have a server at the university that serves a lot of people. Some Zohar develops his code on the server, and what can I say, the kid isn't perfect. Oops, the server has crashed; whose fault is it? Zohar's! Why does he make these mistakes that kill everyone's server... Oh, right, who in their right mind would use a Windows server to serve multiple users? Okay, if that's the case, then it's no biggie to have such bugs in the OS, and we can blame Maya for them.

The user asks Maya for something; Maya asks the OS for it. Maya doesn't know (except in admin mode) which processes are running, how much they consume, or how many resources can be allocated. Whose job is that? Right: the OS's.

Concerning the Mac, I never touch that *hit, and from the rumors I never will. I like my freedom. If I weren't playing computer games, I would have moved to Debian (Linux) long ago, replaced Maya with Blender (how the hell did I choose an app that isn't open source?!), and fixed any stupid problem I have myself. Instead of writing lame patches, I'd just go and fix the source. Dreams...

CGTalk Moderation
07-25-2012, 09:43 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.