TV underscan/overscan

I apologize if I describe this problem poorly. My knowledge of display technology is very limited, but here goes.

Often when I connect a laptop to a monitor or TV via HDMI and set the resolution to 1920x1080 (or whatever the display reports as its native resolution), the image comes out too large or too small for the screen (e.g., the edges are cut off, or there’s a black strip running around the entire picture). So I’d have to go into the display properties on the PC and fiddle with the underscan/overscan setting until the image fits the screen correctly. In one case, the configuration software for the ATI card in the laptop didn’t even have a way to adjust the scan, so I had to live with it. (They added the option in a later version of the driver software, thankfully.)

However, I can take the same monitor/TV and plug a cable or FiOS set-top box into it (also via HDMI), and the image matches up perfectly with the display without my having to make any manual adjustments of any kind.

So, in layman’s terms, can anyone explain why PCs have this problem but set-top boxes don’t? The TVs/monitors I’m talking about are relatively modern, e.g. a 60" Sharp Aquos LED.

[I did find a few threads here that touch on overscan, but none of them seemed to explain why STBs never have this issue.]