What's the dot pitch on a TV?

So… I have a rare opportunity to buy a GIANT monitor (37 inches) for cheap. I plan to use it as a TV for my living room. I am, however, worried about the dot pitch (.85). I figure since we'll be sitting on average more than 4x farther away than from my .24 monitor it should be OK, but I don't know.

Anyone know what the dot pitch of a monitor is?

edit… I meant the dot pitch of a TV

Dot pitch is the spacing between adjacent phosphor dots of the same color, usually given in millimeters. A phosphor dot is the basic unit of the picture: how many dots there are, and how densely they are packed, determines how finely-grained the picture will be. The smaller the pitch, the more dots per inch, and the better the image can be.

"TV sets do not generally have a DOT PITCH specification"

Ok, I know TVs generally do not have dot-pitch specs, but they do have pixels. How big are they in relation to a standard CRT? Bigger or smaller than a .85 mm dot pitch?

or… if anyone happens to own an industrial monitor: I'm looking to buy an NEC XM37 37-inch monitor. Anyone have personal experience with it?

A high definition (HD) set can display up to 1080 lines. A standard set can do 480. So if you measure the screen height and divide by the correct number above you should have your answer.

The NTSC* television line standard in the U.S. since 1941 has been 525 lines, not 480 (480 was the British standard from 1936 until the 1960s).

  • National Television System Committee

err, doesn't that mean the bigger the TV, the crappier the resolution? That seems completely wrong to me…

in any case…

let's say a 36" diagonal for ease of use…

36² = a² + b²

a = b (it might be a 4:3 aspect ratio, not sure…)

… 25.455 / 525 = .0485 inches per line; × 2.54 = .123 cm; × 10 = about 1.23 mm per line.

so… with a dot pitch of .85, it should be slightly better than a TV, right?
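For what it's worth, the back-of-the-envelope above (treating the screen as square) can be sanity-checked with a quick script; the 36" diagonal and 525-line figure come from the posts above:

```python
import math

# Sketch of the estimate above: treat the 36" diagonal screen as square
# (a = b), so each side is 36 / sqrt(2) inches, then divide that height
# by the 525 lines of an NTSC frame to get the spacing per scanline.
diagonal_in = 36.0
side_in = diagonal_in / math.sqrt(2)   # side of the assumed-square screen
line_pitch_mm = side_in / 525 * 25.4   # inches -> mm

print(round(side_in, 3))        # 25.456
print(round(line_pitch_mm, 3))  # 1.232
```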

It sounds like you’re going to buy an old(er) presentation monitor and anticipate getting a kick ass TV monitor in the bargain.

The thing is, in most cases, a new $500 - $700 31/32" flatscreen (CRT) TV set will outperform the older $5,000 industrial presentation monitor as a TV set. You're not going to get the most amazing TV picture you can imagine by doing this, and it is doubtful the presentation monitor is really set up, scanning- and resolution-wise, to handle HDTV output. Giant clunky old presentation monitors are best at just that - PC video output for presentations to a roomful of people.

Correction to my previous post: 405-line television was the British standard from 1936 until the 1960s.

No, the resolution is the same on a big screen and a small screen. The TV signal only has 525 lines of resolution (more for HDTV), so a bigger screen won’t give you a better picture, just a bigger one.

NTSC is 4:3, widescreen HDTV is 16:9.

Because the screen isn’t square, a 36" diagonal screen is only 21.6" high, so the distance between each line is 1.045 mm. (It’ll be slightly higher for a 37" diagonal screen.)
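A quick sketch of that corrected arithmetic (assuming a 4:3 screen and all 525 NTSC lines, per the posts above):

```python
# A 4:3 screen with a 36" diagonal forms a 3-4-5 triangle, so the
# height is diagonal * 3/5. Dividing by the 525 lines of an NTSC
# frame gives the spacing between scanlines.
diagonal_in = 36.0
height_in = diagonal_in * 3 / 5         # 21.6 inches
line_pitch_mm = height_in * 25.4 / 525  # inches -> mm, spread over 525 lines

print(height_in)                # 21.6
print(round(line_pitch_mm, 3))  # 1.045
```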

Now, that’s still a lot more than 0.85 mm. Dot pitch refers to the distance between the actual phosphors on the screen, which don’t have to correspond to the lines in the TV signal. Just look at your PC monitor: It has the same number of phosphors whether you’re in a 640x480 or 1280x1024 video mode, which means each pixel is made of more phosphors in 640x480.
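To illustrate that point about fixed phosphors, here's a hypothetical sketch; the 0.25 mm pitch and 320 mm width are made-up example numbers, not any real monitor's specs:

```python
# A monitor's phosphor triads are fixed by the hardware, so lower video
# modes simply use more triads per pixel. The width and pitch below are
# invented example figures, not real specifications.
def triads_per_pixel(screen_width_mm: float, pitch_mm: float,
                     horizontal_pixels: int) -> float:
    triads = screen_width_mm / pitch_mm  # total triads, fixed by the tube
    return triads / horizontal_pixels

print(triads_per_pixel(320.0, 0.25, 640))   # 2.0 triads per pixel
print(triads_per_pixel(320.0, 0.25, 1280))  # 1.0 triad per pixel
```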

So the talk about the number of lines in the TV signal is kind of irrelevant to the dot pitch. Bigger TV sets spread the same number of raster lines over a larger area, but the phosphor dots don’t have to be any farther apart than on a smaller set - there are just more dots.

Also be aware that for NTSC (standard US TV, as opposed to ATSC or HDTV) there isn't a specific horizontal resolution, and even the vertical resolution isn't as hard and fast as it may seem. Further, analog TVs do not technically have "pixels" in the strict sense that digital monitors do. "Thus sayeth the Highest Technical Standards," but lesser, yet highly authoritative, sources often use the term pixel for convenience. I won't debate the point. I've seen knowledgeable people in technical fora war for days on that one. Most usually figure it out in the end; some never do - or never admit it.

You may think any TV will display the number of scanlines in the signal (480, i.e. the 525 lines in an NTSC frame minus the 45 lines used for vertical sync - the black bar between frames, which now also carries digital data like closed captioning, sports and stock updates), but not all TVs distinctly display all 480 lines. We'll get back to that.

Worse, the horizontal scan is an analog signal, varying continuously as the beam scans along a scanline. Depending on how good the electronics are, how tight the mask is, and how precisely the phosphors are painted (that's an oversimplification), a TV may be able to resolve 720 lines of horizontal resolution or more, which would (oddly) be equivalent to about 1000 horizontal pixels, if a TV had real pixels. (Why? It's a theorem of sampling, and has to do with the fact that, in real life, changes in the analog signal never line up precisely with the edges of the phosphors. Therefore you need sqrt(2) more theoretical resolution to consistently view a certain real resolution.)
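As a sanity check on that sqrt(2) rule of thumb (the factor is the post's own reasoning, not an official spec):

```python
import math

# Rule of thumb from the post above: to reliably resolve N lines you
# need about sqrt(2) * N of theoretical resolution, since transitions
# in the analog signal never line up exactly with phosphor edges.
resolved_lines = 720
theoretical_needed = resolved_lines * math.sqrt(2)

print(round(theoretical_needed))  # 1018, i.e. roughly the "1000" quoted
```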

You probably know how a color TV works: there are three electron guns, each shooting a beam that strikes phosphors of a certain color. A 'mask' between the guns and phosphors keeps the edges of the regions illuminated by each gun tidy.

There are different geometries - ways of laying out the guns - which cause different sub-pixel layouts. In a room lit by three bulbs close to each other, you would have three slightly different shadows. Similarly, the 'holes' in the mask cause a slightly different pattern of dots from each of the three electron guns. In some TVs, the mask is a fine set of parallel vertical wires, so the phosphors are drawn in sets of three vertical stripes - one glowing in each color. In others, the guns are in a triangular pattern, and the mask is a plate with holes in it, with the phosphors (subpixels) painted in triangular sets.

Let's look at the "stripe" style of mask and phosphor: a TV maker might decide to put down 1000 sets of stripes, but painting 3000 stripes across the screen with no overlap, and then aligning the mask precisely to them - well, you can see how that would be a headache, and make for an expensive tube. (Oversimplification.) On the other hand, doing it half as precisely (500 sets) usually costs far less than half as much. You can also cut corners by not being so meticulous about keeping the phosphors from overlapping, not being as picky about the alignment of the mask, or just using thicker wires in the mask to give yourself more margin of error. These compromises will cause adjacent theoretical 'dots' to blur a little (or a lot), or affect the brightness and clarity of each dot.
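To put rough numbers on that stripe geometry, here's a hypothetical sketch; the screen width and triad counts are made-up examples, not real tube specs:

```python
# Given a screen width and a number of red/green/blue stripe sets
# (triads), compute the pitch of each triad and of each individual
# phosphor stripe. All figures below are invented for illustration.
def stripe_pitch_mm(screen_width_mm: float, triads: int) -> tuple[float, float]:
    triad_pitch = screen_width_mm / triads
    stripe_pitch = triad_pitch / 3  # three stripes (R, G, B) per triad
    return triad_pitch, stripe_pitch

# A ~730 mm wide screen: the precise 1000-triad tube vs. the cheaper
# 500-triad one. Fewer triads means a coarser pitch.
print(stripe_pitch_mm(730.0, 1000))  # triad ~0.73 mm, stripe ~0.24 mm
print(stripe_pitch_mm(730.0, 500))   # triad ~1.46 mm, stripe ~0.49 mm
```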

Similarly, the vertical resolution can be affected if the guns are not as tightly focused, or even overlap slightly. In NTSC, each frame is actually drawn as two subframes (fields) - first the "odd-numbered" lines are drawn, then the "even." This is because back in the Olden Days, they couldn't build affordable tuners and scan electronics that could cram all 480 lines together at once in 1/60th of a second (a number based crudely on the frequency of US AC current - in Europe, they use 50 Hz). Doing the odds and then the evens let the TV have looser tolerances - a spacing of two lines, rather than one. Because of this, many TVs actually have some overlap between adjacent scanlines - the even lines overlap the odd ones, but you don't notice much because the even lines have dimmed by the time you light up the odd lines, and vice versa.
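The odd/even field ordering described above can be sketched like this (a toy frame, not real NTSC timing):

```python
# Sketch of interlacing as described above: each frame is drawn as two
# passes ("fields") - first the odd-numbered scanlines, then the even
# ones. Shown here for a toy 8-line frame.
def interlaced_order(lines: int) -> list[int]:
    odd_field = list(range(1, lines + 1, 2))   # lines 1, 3, 5, ...
    even_field = list(range(2, lines + 1, 2))  # lines 2, 4, 6, ...
    return odd_field + even_field

print(interlaced_order(8))  # [1, 3, 5, 7, 2, 4, 6, 8]
```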

My point is: actual screen resolution is more complex than you might think. This post, as nitpicky as it seems, is a gross simplification, but the effects can easily be seen with a test signal generator. A signal with fine alternating lines (vertical or horizontal) can reveal the true resolution of a screen. Some TVs (or VCRs) can't resolve 400 vertical lines across the screen; the best can resolve close to 700. (Vertical lines test horizontal resolution, because they are side-by-side horizontally.) Failure to resolve lines fully leads to an overall grey, which our eyes accept in most cases without noticing. Similarly, if you create a signal where the odd scanlines are one color and the even scanlines are another (easy to do, since all the odd scanlines are sent together), you can look at the screen and see the overlap - which can be different for different colors in many cases.

Some TV tube manufacturers define dot pitch as the smallest dot the screen could resolve if the electron guns were perfect (which they know they aren't, since the guns are part of the tube they make). Even if this were an honest measure, it would also require a perfectly synchronized signal (or you lose up to 41% of your resolution right away), a perfect tuner, and perfect drive electronics. None of which you're going to get.

The many kinds of overlap I’ve mentioned (and several others) are why analog TVs don’t have true pixels in the current technical sense. This also means that, in analog TVs, ‘dot pitch’ is an indicator of the visual sharpness, but not a rigorous measure. In digital monitors, with true pixels, it’s much more definitive.

This is off the top of my head, so I'm sure I said some things badly, and perhaps even a bit incorrectly. I would welcome correction. It's been a long time since I was into this stuff.

Actually, not really. I'm a college student and we need a TV for my apartment. The monitor will be costing me $100 (quite cheap, I think, for a 37-inch TV!) and I plan to use it with an Asus NTSC TV tuner card (10-bit Conexant CX23880 decoder) in my spare computer. All I'm concerned with is getting a working TV that is on par with other 3-4 year old TVs.

Oh! OK, I get it.

You totally lost me. But in your "expert" opinion, do you think this would make a decent (watchable) TV?

TVs have pretty big dots; you could almost measure the pitch yourself with a ruler. Dot pitch is the spacing between dots, not the size of the dot itself.

Still looking for a definitive answer! Personal experiences would suffice as well.

Harmonix, it's a weird question, but why not call a TV shop & ask?

Good idea, Handy. I will, but unfortunately I don't have high expectations. I somehow doubt the drones at a retail shop would actually know what they're talking about. I'll look around anyway.

… does anyone know the number of a TRUE enthusiast shop? Somewhere where the employees would actually know what they're talking about?

Umm… I think your question was already answered more than once in this thread: TVs aren't specified by dot pitch, and this isn't something even a good video tech would know, since manufacturers don't list or otherwise make this information available as a useful or informative spec for how well a TV can display analog video signals.

Actual dot pitch on a TV might well be related to the maximum lines of useful resolution the TV can display, but this simply isn't the way TV monitors are specified. HDTVs have more monitor-like specs, but they are not "regular TVs" by any stretch of the imagination.

It’s simply kind of an irrelevant spec to pursue for useful information about regular analog TV picture quality.

There is no point in making the horizontal resolution any better than the vertical, so I think that is what set manufacturers shoot for. The horizontal resolution could be set, as the dissertation on tube geometry above shows, within wide limits, but better resolution means higher cost. Why spend money on horizontal resolution when the vertical resolution is set by the standards and your picture can't be a whole lot better than that anyway?