Computer Memory

This probably sounds like a naive question to any computer gurus, but I can’t figure it out with my modest computer knowledge.

My PC has 128 MB RAM. When I look at the system properties under Windows NT it says I have 130,484 KB. Now I know that 1K is not exactly 1000 bytes, but I can't see how 128 MB is 130,484 KB.

Can anyone help?

This is basic computer nuts and bolts. Since computers are binary inside, 10 is in fact not a nice round number; 2, 4, 8, 16, 32, etc. are. It's called the powers-of-two series. 2 to the 10th power is 1024. Since that's close to a thousand, and a "round" number as far as your CPU is concerned, we call that a kilobyte. 2 to the 20th is 1,048,576, close enough to one million to call that a megabyte.
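
To make that concrete, here's a quick sanity check (a Python sketch, just my own back-of-the-envelope arithmetic):

```python
# Binary prefixes: 1 KB = 2**10 bytes, 1 MB = 2**20 bytes
KB = 2 ** 10           # 1,024 bytes
MB = 2 ** 20           # 1,048,576 bytes

print(128 * MB)        # 134,217,728 bytes in 128 MB
print(128 * MB // KB)  # 131,072 KB -- what a true 128 MB comes out to in KB
```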

I posted too fast and made the faux pas of quoting your whole post. That number isn't the equivalent of 128 megabytes either. I'm not at work so I don't have an NT machine handy to compare. It could be a quirk (it's not a bug, it's a feature) in NT or something unique in your hardware that causes the 588kb discrepancy. 128 megabytes should be 131,072kb.

Yes, I understand that 1 MB is 1,048,576 bytes or 1,024 KB but what I don’t get is that 128 x 1,024 = 131,072 not 130,484. Sorry if I didn’t phrase the question very well.

Accept my mea culpa along with this laurel…and hearty handshake as a welcome to SDMB.

Sorry Padeye. Now I’ve gone and posted too slow to see your second response.

Whoops, I did it again. :)

Just a guess…I think it’s subtracting what the OS uses as “conventional” RAM…subtract 640K (the conventional RAM amount), and you’re darn close to what you get…That and the OS probably rounds what it finds.

Jman
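
Running Jman's guess through the numbers (my own arithmetic in a quick Python sketch; not anything NT documents):

```python
# Jman's guess: subtract the 640 KB "conventional" region from a full 128 MB
expected = 128 * 1024       # 131,072 KB for a true 128 MB
reported = 130_484          # what MrWhy's NT box shows

print(expected - 640)       # 130,432 KB -- close to 130,484, but still 52 KB off
print(expected - reported)  # 588 KB actually unaccounted for
```

Close, but not exact, so the rounding part of the guess would have to cover the last 52 KB.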

I have no answer, but I run a dual boot 98/2K machine. I just booted to Win98 and it happily tells me I have 128MB.

However, Win2K is telling me I have 130,552K. So MrWhy is missing 588K in NT(4?) and I am missing 520K in 2000.

Jman, not a bad guess but I don’t think the NT kernel even utilizes the conventional/extended memory model.

I agree with Padeye, this must be a *cough* undocumented feature of the NT kernel.

A very quick search of the Knowledge Base yielded no answers either. Not that they're not there, but not easily found.

I pull up the little real-time memory totals tab in KDE under Linux.

I too, thought I had 128MB, btw.
Total memory 130920448 bytes = 124 MB
Free memory 8257536 bytes = 7 MB*
Shared memory 47345664 bytes = 45 MB
Buffer memory 9416704 bytes = 8 MB

So. Anyone know why I only have 124MB? Even rounded down to the nearest integer?

*yes, this seems like very little free, but linux is a pretty good memory manager. Usually all available memory seems to get allocated, and in an efficient manner.
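
For what it's worth, converting those KDE figures by hand (again just a Python sketch of the arithmetic, not anything KDE itself does):

```python
MB = 2 ** 20                          # 1,048,576 bytes
total_bytes = 130_920_448             # KDE's reported total memory

print(total_bytes / MB)               # ~124.85 -- rounds down to the 124 MB shown
print((128 * MB - total_bytes) / MB)  # ~3.1 MB not reported as usable
```

So about 3.1 MB of the nominal 128 MB isn't being counted, presumably reserved by the kernel or the hardware.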

Kyberneticist, I know exactly nothing about Linux, but do you have a mobo with onboard video using 4MB? Or perhaps there is some utility in Linux that will reserve physical memory like SmartDrive in WIN/DOS O/S?

I’ll have to check around for cites on this but IIRC memory manufacturers mess around with this, and that’s why different people can get different numbers.

When you buy 128MB of RAM the manufacturer guarantees that at least 128MB (or 131,072kb) worth of transistors exist on their chip(s). However, in the millions of teeny transistors on your memory chip, it is common for some to not work. This is ok as the memory will know which parts to avoid and will function normally. I imagine there is an error margin the industry accepts as still valid for sale at an advertised memory size.

This same effect also happens on computer hard drives. Almost all hard drives contain errors from manufacture that reduce total capacity. This is normal and something most people are never aware of, as the hard drive also knows to avoid the ‘bad’ spots. Manufacturers have the error rate nailed down pretty well and build in drive capacity slightly larger than the advertised size, so when they lose some capacity through these errors they are still on target for claimed capacity.

FYI: My computer (NT) reports 130,468 KB of RAM on a stated 128MB system.

MrWhy, same thing with your hard drive.

Jeff_42 - I don’t think the bad sector analogy can apply to memory as with disk drives. Drive space is allocated by 512 byte blocks which can easily be marked as bad and not used. Memory is accessed as an array to allow byte level allocation. Modules have to be 100% good to work at all.

FWIW my *^%#@ NT 4 machine at work has 96mb of ram but reports 97,712kb, a 592kb shortfall.
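
Same arithmetic on that box, for the record (my own quick Python check):

```python
expected = 96 * 1024        # 98,304 KB for a true 96 MB
reported = 97_712           # what the NT 4 machine shows

print(expected - reported)  # 592 KB shortfall -- same ballpark as the other machines
```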