February 13th, 2013, 12:28 AM
How best to prevent memory exhaustion in time-critical dynamic programming?
I am trying to solve a knapsack problem in a time-critical application. I've found I can shave a couple dozen seconds off of some inputs with dynamic programming. The problem is that the amount of memory consumed by the memoized data (http://en.wikipedia.org/wiki/Memoization) is so large on many inputs that performance begins to drop, and on others the program crashes due to memory exhaustion (or perhaps fragmentation?). I'm quite sure I'm not leaking memory.
I have modified my code so that I can put a hard limit on the amount of data it tries to memoize, but no single limit works consistently. I've found that a hard-coded ceiling is too large for some inputs while it could safely be larger for others (because several other factors, tied to the way the data is processed, also affect memory consumption). Plus, I want to take advantage of the additional RAM on computers that have it.
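To make the question concrete, here is a minimal sketch of the kind of capped memoization I mean (the names and the cap policy are hypothetical, not my actual code): a recursive 0/1 knapsack that stops storing new subproblem results once the memo table reaches a fixed entry count, recomputing anything beyond that.

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical sketch: 0/1 knapsack with a hard cap on memoized entries.
// Once maxMemoEntries is reached, further subproblems are recomputed
// rather than stored, trading time for bounded memory.
struct Item { int weight; int value; };

class BoundedKnapsack {
public:
    BoundedKnapsack(std::vector<Item> items, std::size_t maxMemoEntries)
        : items_(std::move(items)), maxMemoEntries_(maxMemoEntries) {}

    int solve(int capacity) { return best(items_.size(), capacity); }

private:
    int best(std::size_t n, int cap) {
        if (n == 0 || cap <= 0) return 0;
        // Pack (n, cap) into one 64-bit key for the memo table.
        std::uint64_t key = (static_cast<std::uint64_t>(n) << 32) |
                            static_cast<std::uint32_t>(cap);
        auto it = memo_.find(key);
        if (it != memo_.end()) return it->second;

        int result = best(n - 1, cap);  // option 1: skip item n-1
        if (items_[n - 1].weight <= cap) {
            // option 2: take item n-1
            int take = items_[n - 1].value +
                       best(n - 1, cap - items_[n - 1].weight);
            if (take > result) result = take;
        }
        // Memoize only while under the hard-coded ceiling.
        if (memo_.size() < maxMemoEntries_) memo_.emplace(key, result);
        return result;
    }

    std::vector<Item> items_;
    std::size_t maxMemoEntries_;
    std::unordered_map<std::uint64_t, int> memo_;
};
```

The same answer comes out regardless of the cap; only the running time changes, which is exactly why I'd like the cap to track the machine's available memory instead of being fixed.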
In effect, I need a function that can tell me how much memory is readily available. I've searched the web and found little beyond repeated warnings that such an idea is a bad one under modern operating systems. So my first question is: if that's true, then
1.) what should I be doing instead?
Assuming this is the least of the evils for my situation, I'm trying to use GlobalMemoryStatusEx. I take the smaller of the ullAvailPhys and ullAvailVirtual members, subtract about 1 GB for use by other processes, and then multiply the result by a factor less than one as a safety margin. My second question is,
2.) am I even looking at the correct fields of the MEMORYSTATUSEX structure? (The last time I deeply understood how OSes and processors handled memory was on a computer with a Motorola 68000 processor, so I'm having to learn how modern OSes assign/page/virtualize/commit/whatever memory.)
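For reference, this is the budget calculation I described, factored into a pure function so the policy is visible (the 1 GB reserve and 0.75 safety factor are my guesses, not recommended values). On Windows the two inputs would come from GlobalMemoryStatusEx, as shown in the guarded helper:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// Sketch of the budget rule described above: take the smaller of available
// physical and available virtual memory, reserve a fixed amount for other
// processes, then apply a safety factor. Reserve and factor are assumptions.
std::uint64_t MemoBudgetBytes(std::uint64_t availPhys,
                              std::uint64_t availVirtual,
                              std::uint64_t reserveBytes = 1ull << 30,  // ~1 GB
                              double safetyFactor = 0.75) {
    std::uint64_t avail = std::min(availPhys, availVirtual);
    if (avail <= reserveBytes) return 0;  // nothing safely usable
    return static_cast<std::uint64_t>((avail - reserveBytes) * safetyFactor);
}

#ifdef _WIN32
#include <windows.h>
// Feed the function from GlobalMemoryStatusEx (ullAvailPhys is available
// physical RAM; ullAvailVirtual is unreserved, uncommitted address space
// in this process).
std::uint64_t QueryMemoBudget() {
    MEMORYSTATUSEX ms = {};
    ms.dwLength = sizeof(ms);
    if (!GlobalMemoryStatusEx(&ms)) return 0;
    return MemoBudgetBytes(ms.ullAvailPhys, ms.ullAvailVirtual);
}
#endif
```

For example, with 4 GB physical and 8 GB virtual available, this yields (4 GB - 1 GB) * 0.75 = 2.25 GB as the memoization budget.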