Didn’t want to hijack this thread, so I thought I’d ask here.
You see TV SD aspect ratio as 4:3
You see TV HD aspect ratio as 16:9
Using lowest common denominator, 16:9 results in (ahem) 4:3!
This is something that’s always bugged me - I know there’s a difference, but it would seem to me that it’s more something like 4:3 vs 7:3 or something like that. In other words, it looks to me that 4:3 would EQUAL 16:9.
Four divided by three gives a ratio of ~1.33 pixels on the horizontal axis for every 1 pixel on the vertical. Sixteen divided by nine gives you ~1.77 pixels on the horizontal for every 1 on the vertical.
If you’re thinking of them as fractions, 16/9 and 4/3, the least common denominator would be 9, so with 4/3 you multiply both the numerator and denominator by 3 and get 12/9 and can easily see that 12/9 does not equal 16/9.
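If you want to check that with code, here's a minimal sketch using Python's fractions module (the variable names are just mine):

from fractions import Fraction

sd = Fraction(4, 3)    # SD aspect ratio, 4:3
hd = Fraction(16, 9)   # HD aspect ratio, 16:9

print(float(sd))   # 1.333... horizontal units per vertical unit
print(float(hd))   # 1.777...
print(sd == hd)    # False - over the common denominator 9, 4/3 is 12/9, not 16/9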
You don’t reduce fractions (or ratios) by taking the square root of both elements. That’s not what lowest common denominator means.
4:3 means that for every 4 units across, the screen is 3 units high. You could also express that as roughly 1.33 horizontal units for every 1 vertical unit.
16:9 means that for every 16 units across, the screen is 9 units high, which means roughly 1.78 horizontal units for every 1 vertical unit.
As an aside, way back when I was shopping for an HDTV, I realized how misleading the diagonal measurement is when comparing the two shapes, so I made a quick Excel sheet to compute the actual height and width of each from its diagonal length:
When you do the math you see that although a 32" screen was large for an SD tube, a 32" HDTV is only about 28" wide by 16" high, versus roughly 26" wide by 19" high for a 4:3 set with the same diagonal - the widescreen set is always the shorter of the two.
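For anyone who wants to redo that math without Excel, here's a rough Python equivalent (the function name and the 32" example are just illustrative); it's plain Pythagorean scaling:

import math

def screen_dimensions(diagonal, ratio_w, ratio_h):
    # A ratio_w x ratio_h rectangle has diagonal sqrt(ratio_w^2 + ratio_h^2),
    # so scale both sides by diagonal / that length.
    scale = diagonal / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

w_sd, h_sd = screen_dimensions(32, 4, 3)    # ~25.6 x 19.2
w_hd, h_hd = screen_dimensions(32, 16, 9)   # ~27.9 x 15.7
print(f'32" SD (4:3):  {w_sd:.1f} wide x {h_sd:.1f} high')
print(f'32" HD (16:9): {w_hd:.1f} wide x {h_hd:.1f} high')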
This is one reason why I still use a 4:3 monitor (a CRT, no less) and not a fancy widescreen one - depending on what you do, the widescreen shape is really only an advantage if you watch HD movies on it.