Also be aware that for NTSC (standard US analog TV, as opposed to ATSC or HDTV) there isn’t a specific horizontal resolution, and even the vertical resolution isn’t as hard and fast as it may seem. Further, analog TVs do not technically have “pixels” in the strict sense that digital monitors do. Thus sayeth the Highest Technical Standards; lesser, yet highly authoritative, sources often use the term “pixel” for convenience. I won’t debate the point. I’ve seen knowledgeable people in technical fora war for days on that one. Most figure it out in the end; some never do, or never admit it.
You may think any TV will display the number of scanlines in the signal: 480, which is the 525 lines in an NTSC frame minus the 45 lines used for vertical sync (the black bar between frames, which now also carries digital data such as closed captioning, sports scores, and stock updates). But not all TVs distinctly display all 480 lines. We’ll get back to that.
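The line budget above is simple arithmetic, and worth sanity-checking. A quick sketch (the variable names are my own; the numbers are the ones given in the text):

```python
TOTAL_LINES = 525   # lines in one NTSC frame
VSYNC_LINES = 45    # vertical blanking interval: sync, captions, data services

visible = TOTAL_LINES - VSYNC_LINES   # visible scanlines per frame
per_field = visible // 2              # lines in each interlaced field
print(visible, per_field)             # 480 240
```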
Worse, the horizontal scan is an analog signal, varying continuously as the beam sweeps along a scanline. Depending on how good the electronics are, how tight the mask is, and how precisely the phosphors are painted (that’s an oversimplification), a TV may be able to resolve 720 horizontal lines or more, which would (oddly) be equivalent to about 1000 horizontal pixels, if a TV had real pixels. (Why? It’s a theorem of sampling, and has to do with the fact that, in real life, changes in the analog signal never line up precisely with the edges of the phosphors. Therefore you need roughly sqrt(2) times more theoretical resolution to consistently view a given real resolution.)
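The sqrt(2) margin argument above can be sketched in a couple of lines. This is a rough rule of thumb as the text presents it, not a hard law, and the function name is my own:

```python
import math

def dots_needed(resolved_lines):
    """Theoretical addressable dots needed to consistently resolve a
    given number of alternating lines, when the signal's transitions
    never align with the phosphor grid (per the sqrt(2) sampling-margin
    argument; an approximation, not an exact standard)."""
    return resolved_lines * math.sqrt(2)

print(round(dots_needed(720)))  # 1018, i.e. roughly 1000 "pixels"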
You probably know how a color TV works: there are three electron guns, each shooting a beam that strikes phosphors of a certain color. A ‘mask’ between the guns and the phosphors keeps the edges of the regions illuminated by each gun tidy.
There are different geometries (ways of laying out the guns), which cause different sub-pixel layouts. In a room lit by three bulbs close to each other, you would see three slightly different shadows. Similarly, the ‘holes’ in the mask cast a slightly different pattern of dots from each of the three electron guns. In some TVs, the mask is a fine set of parallel vertical wires, so the phosphors are laid down in sets of three vertical stripes, one glowing in each color. In others, the guns are in a triangular pattern, and the mask is a plate with holes in it, with the phosphors (subpixels) painted in triangular sets.
Let’s look at the “stripe” style of mask and phosphor: a TV maker might decide to put down 1000 sets of stripes, but painting 3000 stripes across the screen with no overlap, and then aligning the mask precisely to them – well, you can see how that would be a headache, and make for an expensive tube. (Oversimplification.) On the other hand, doing something half as precisely (500 sets) is usually more than twice as cheap. You can also cut corners by not being so meticulous about keeping the phosphors from overlapping, not being as picky about the alignment of the mask, or just using thicker wires in the mask to give yourself more margin for error. These compromises will cause adjacent theoretical ‘dots’ to blur a little (or a lot), or affect the brightness and clarity of each dot.
Similarly, the vertical resolution can suffer if the guns are not as tightly focused, or if adjacent lines even overlap slightly. In NTSC, each frame is actually drawn as two subframes: first the “odd-numbered” lines are ‘drawn’, then the ‘even’. This is because back in the Olden Days, they couldn’t build affordable tuners and scan electronics that could cram all 480 lines together at once in 1/60th of a second (a number based crudely on the frequency of US AC current; in Europe, they use 50 Hz). Doing the odds and then the evens let the TV have looser tolerances: a spacing of two lines rather than one. Because of this, many TVs actually have some overlap between adjacent scanlines. The even lines overlap the odd ones, but you don’t notice much because the even lines have dimmed by the time you light up the odd lines, and vice versa.
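The odds-then-evens drawing order can be sketched directly. This is a simplification (real NTSC line numbering within fields is messier), using 1-based line numbers to match the odd/even language above:

```python
def interlaced_order(visible_lines=480):
    """Order in which an NTSC frame's visible lines are drawn:
    odd-numbered lines first (field 1), then even-numbered (field 2).
    A simplification -- 1-based numbering, sync lines ignored."""
    odd_field = list(range(1, visible_lines + 1, 2))
    even_field = list(range(2, visible_lines + 1, 2))
    return odd_field + even_field

order = interlaced_order()
print(order[:3], order[240:243])  # [1, 3, 5] [2, 4, 6]
```

Note the neighbors of line 2 on the screen (lines 1 and 3) are drawn a full field earlier, which is why their glow has partly faded before line 2 lights up.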
My point is: actual screen resolution is more complex than you might think. This post, as nitpicky as it seems, is a gross simplification, but the effects can be easily seen with a test-signal generator. A signal with fine alternating lines (vertical or horizontal) can reveal the true resolution of a screen. Some TVs (or VCRs) can’t resolve 400 vertical lines across the screen; the best can resolve close to 700. (Vertical lines test horizontal resolution, because they sit side by side horizontally.) Failure to resolve the lines fully leads to an overall grey, which our eyes accept in most cases without noticing. Similarly, if you create a signal where the odd scanlines are one color and the even scanlines are another (easy to do, since all the odd scanlines are sent together), you can look at the screen and see the overlap, which can be different for different colors in many cases.
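The odd/even colored-scanline pattern described above is easy to build in software. A minimal sketch (the colors and dimensions are arbitrary choices of mine, and a real generator would then encode this as a composite signal):

```python
def test_pattern(width=640, height=480,
                 odd_color=(255, 0, 0), even_color=(0, 0, 255)):
    """Build a frame (list of rows of RGB tuples) with odd scanlines one
    color and even scanlines another, the kind of pattern that makes
    interlace overlap visible on an analog set. Illustrative only."""
    rows = []
    for y in range(1, height + 1):  # 1-based, to match odd/even line talk
        color = odd_color if y % 2 else even_color
        rows.append([color] * width)
    return rows

frame = test_pattern()
print(frame[0][0], frame[1][0])  # (255, 0, 0) (0, 0, 255)
```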
Some TV tube manufacturers define dot pitch as the smallest dot the screen could resolve if the electron guns were perfect (which they know they aren’t, since the guns are part of the tube they made). Even if this were an honest measure, it would also require a perfectly synchronized signal (or you lose up to 41% of your resolution right away), a perfect tuner, and perfect drive electronics. None of which you’re going to get.
The many kinds of overlap I’ve mentioned (and several others) are why analog TVs don’t have true pixels in the current technical sense. This also means that, in analog TVs, ‘dot pitch’ is an indicator of visual sharpness, but not a rigorous measure. In digital monitors, with true pixels, it’s much more definitive.
This is off the top of my head, so I’m sure I said some things badly, and perhaps even a bit incorrectly. I would welcome corrections. It’s been a long time since I was into this stuff.