I just got away from a good session of AOLtech fun. It turns out that system response only depends on the percentage of resources left. So 85% of 32MB is better than 50% of 512MB.
My questions for you SDMB demi-gods are these:
Does any of this make sense?
What generates a 504 timeout?
What in the wide, wide world of sports is memory leakage?
Memory leakage… Generally, when you start up a program it takes some memory for itself, and when you close the program it releases that memory back to the computer to be used for something else. Poorly programmed apps will sometimes not release the memory back. So if you have a program that takes 4 megs and never gives it back, and you keep opening and closing it, you will gradually use up all your memory. And it doesn’t have to be a program you run all the time; don’t forget that if you never turn your computer off and you run this program 3 times a week, after a month you’ve tied up 48 megs for no reason.
The system resources that you’re talking about are independent of RAM. What that 50% figure is telling you is the amount of space you have left on the heap, which is one particular place your computer allocates memory from. The heap size is fixed, so no matter how much RAM you have, system resources can still constrain you. AFAIK, every operating system works that way.
A 504 (Gateway Timeout) occurs when a server acting as a gateway or proxy doesn’t get a timely response from the upstream server it needs to fulfill your request.
What Joey P said. I’m not 100% sure about this, but I believe in some cases, the system can recover from a memory leak. It all does come down to poor programming, though, so I’m very careful to avoid memory leaks when I code.
By poor programming do you mean not including or not properly defining destructors? (Or the non-C++ equivalent.) What else constitutes poor programming where memory is concerned?
A memory leak in C++ is caused by allocating an object with the new operator and failing to deallocate it with the delete operator (or the corresponding operators for arrays). Same type of thing in other languages.
Kinda. I’m not sure whether C++ would handle things nicely if you didn’t define a destructor -- I don’t know much about C++. The problem is making sure the destructor is called at all.
At the very basic level, things make a call to the operating system saying ‘give me some memory’, and then another call when they’re finished saying ‘here, have it back’.
This gets allocated, IIRC, from the heap. This is what uses your resources.
Two things cause memory leaks: forgetting to free the memory (destroy created objects, etc.), or your program crashing before it has a chance to free it.
Some operating systems handle these situations better than others and can reclaim the memory, and some languages --I’m thinking of Java here-- deal with it all for you (presuming the virtual machine itself doesn’t fall over and leave its memory reserved).
Some Windows programs are particularly bad for this, but a lot of the time low system resources are simply due to the amount of stuff running at one time.
Close down as many programs as possible; if you still have low resources, then reboot. Also watch for programs that run on startup and use resources -- a lot of things like to start automatically just to ‘be there’ when they don’t really need to.
The other kind of poor programming where memory is concerned is addressing memory you haven’t reserved, i.e. addressing an array past its declared boundary, which tends to be a common mistake -- by me, at least.
Destructors are called automatically when an object goes out of scope. If you don’t write your own destructor, the compiler provides one for you that basically just calls the destructor of each member variable.
It only comes from the heap if you use new or malloc() or anything like that. Otherwise, it comes from the stack, which the OS can handle on its own, as the scope of stack variables is clearly defined.