A question about early color television

Back in the 1960s, I was driving a lorry for a well-known company called Radio Rentals. At this time, their main business was TV rental and most people rented rather than bought what was a very expensive item. The big boost for B&W TV in the UK was the Coronation on 2 June 1953.

I recall that the first colour TVs were big and very heavy; it took two of us to carry one. All the RR shops had one in their showrooms and we delivered a few to schools. It was quite a while before they became commonplace in private houses.

I remember that, also. But I think I was told it was bad for my eyes.

Another thing about early color televisions was that the color reproduction was often pretty poor. Some people probably feared wasting their money on something that made everyone's faces look like they were horribly ill, and so on.

The ad seems to have been talking both about interference and about quality of picture; so the adwriters may well also have been thinking of that factor.

The ad also mentions magnetic fields (that is, just magnetic, not electromagnetic). A steady magnetic field always influences cathode ray tube images, because it causes electrons to travel in circular trajectories rather than straight lines. On a black and white TV, this would appear as slight geometric distortion, and with a strong magnet it’s clearly visible, but with weak fields the image on an already curved screen would distort little enough that we didn’t notice.
Enter color television, which in those days used three electron guns aimed in slightly different directions, a shadow mask (a metal screen with fine, very carefully placed holes), and a phosphor coating patterned in red, green and blue dots. A magnet would distort this setup too, but with the new consequence that the colors would be off: the green gun (for example) would also hit some of the red phosphor and some of the blue, and miss a bit of its own green. We are very good at perceiving local color shifts, compared to our ability to perceive slight geometric distortions.
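To put rough numbers on that, here's a back-of-the-envelope sketch; the 25 kV anode voltage, 30 cm flight path, and field strengths are typical-CRT assumptions of mine, not figures from the thread:

```python
# Back-of-the-envelope CRT beam deflection in a stray magnetic field.
# Assumed figures (typical colour CRT, my guesses): 25 kV anode voltage,
# ~30 cm gun-to-screen flight path.
import math

E_CHARGE = 1.602e-19   # electron charge, coulombs
E_MASS   = 9.109e-31   # electron mass, kg (ignoring the ~10% relativistic correction at 25 kV)

def beam_shift_m(anode_volts, field_tesla, path_m):
    """Sideways shift after flying path_m through a uniform transverse field.
    The beam follows a circle of radius r = m*v/(e*B); for r >> path
    the shift is approximately path^2 / (2*r)."""
    v = math.sqrt(2 * E_CHARGE * anode_volts / E_MASS)   # speed after acceleration
    r = E_MASS * v / (E_CHARGE * field_tesla)            # cyclotron radius
    return path_m ** 2 / (2 * r)

for label, b_tesla in [("Earth's field, ~50 uT", 50e-6),
                       ("toy magnet near the tube, ~5 mT", 5e-3)]:
    shift_mm = beam_shift_m(25_000, b_tesla, 0.30) * 1000
    print(f"{label}: beam shifted ~{shift_mm:.1f} mm")
# Earth's field: ~4.2 mm; toy magnet: ~420 mm (the small-angle formula has
# collapsed by then - the beam is simply swept off course entirely).
```

With the colour triad pitch around 0.6 mm, even a fraction of a millimetre of extra deflection lands a beam on its neighbour's phosphor, which is why real sets needed purity adjustments and degaussing to cancel out even the Earth's field.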

My mother had a sewing machine, made in the early '50s, that wasn't suppressed like later consumer goods had to be, and when the motor ran (upstairs) you certainly noticed it on the set (downstairs). She never noticed, as she didn't sew and watch TV at the same time, until we got a smaller set in the sewing room.

You were lucky! We only had one black-and-white TV in the entire state! Every night we’d all travel to the state capital by mule, all 2.3 million of us, and some of us would be so far back we had to watch by telescope and rely on smoke signals to tell what Lucy was saying! But we were happy.

Don't forget that they were broadcasting in "compatible color". That meant it had to look good on B&W as well as color sets. They couldn't fine-tune, so to speak, for the color sets alone.

The difference between compatible colour TV encoding and modern digital TV is not that great.
The key realisation was that we have much poorer spatial resolution for colour than for light/dark. So colour TV split the chroma signal out into a sub-band with less bandwidth than the luminance. In a primitive way, this was a predecessor of perceptual coding.

The lower bandwidth did have visible downsides, but there were lots of other compromises, so it wasn't huge in comparison.

Modern codecs similarly drop information in the chroma channels. The most obvious manifestation is 4:2:2 chroma sub-sampling, or worse, 4:2:0.
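A minimal numpy sketch of what 4:2:0 subsampling throws away (a toy, not any particular codec's actual pipeline):

```python
# Toy 4:2:0 chroma subsampling: luma (Y) stays at full resolution, while
# the two chroma planes (Cb, Cr) are stored at half resolution both ways -
# a 75% cut in chroma samples that most viewers never notice.
import numpy as np

rng = np.random.default_rng(0)
h, w = 8, 8
y  = rng.random((h, w))   # luma plane, kept at full resolution
cb = rng.random((h, w))   # chroma planes, full resolution before coding
cr = rng.random((h, w))

def subsample_420(plane):
    """Average each 2x2 block down to a single sample (one common scheme)."""
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(plane):
    """Nearest-neighbour blow-up back to full size, as a crude decoder would."""
    return plane.repeat(2, axis=0).repeat(2, axis=1)

cb_coded, cr_coded = subsample_420(cb), subsample_420(cr)
print("luma samples stored:  ", y.size)                        # 64
print("chroma samples stored:", cb_coded.size + cr_coded.size) # 32, down from 128
reconstructed_cb = upsample(cb_coded)   # what the viewer actually sees
```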

And that’s why we were the last family on our block to get colour television. Mom was dreadfully worried that we’d get “radiation” from the TV. She was also dreadfully worried that we’d damage our eyes if we sat too close to the B&W TV.

Between those two, it’s a wonder that I ever got to see TV at all, when I was a kid.

I still occasionally have to do that with my FM radio aerial under certain cloud conditions - or when the refuse collection truck comes round.

As others mention, older motors were noisy - the brushes made and broke the connection on the rotor, and those abrupt on-off cycles created noise. The mechanical distributor on old automobiles (remember adjusting mechanical timing? I tried to do that once or twice) created the same on-off switching. The result was static, and the video portion of a TV signal was AM, so it was very susceptible to picking up local static.

(Handy hint - if you have a loose connection in the house wiring in one of your outlets - which could cause a fire - you can detect it by waving an AM radio near it. If the static gets louder closer to the outlet, you should check the connection. Same idea - an electrical connection going off and on creates electromagnetic "noise".)
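To see why that trick works at all, here's a toy simulation (invented numbers throughout) of an impulse landing on an AM signal - an envelope detector reads amplitude directly, so the spike goes straight into the demodulated output:

```python
# Toy demo: additive impulse noise (brush sparking, ignition points) rides
# straight through AM demodulation, because the detector reads the envelope.
import numpy as np

fs = 100_000                                   # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)                 # 10 ms of signal
audio   = 0.5 * np.sin(2 * np.pi * 1_000 * t)  # 1 kHz "programme" audio
carrier = np.sin(2 * np.pi * 20_000 * t)       # 20 kHz toy carrier
am = (1 + audio) * carrier                     # standard AM modulation

spikes = np.zeros_like(am)
spikes[::100] = 3.0                            # a spark-like impulse every 1 ms
received = am + spikes

envelope_clean = np.abs(am)                    # crude envelope detector
envelope_noisy = np.abs(received)
err = np.max(np.abs(envelope_noisy - envelope_clean))
print(f"worst detector error from the spikes: {err:.2f}")   # ~3.0 - the full
# spike amplitude punches through; an FM detector, by contrast, can clip
# ("limit") amplitude spikes away before demodulating.
```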

You could really mess up a colour TV by putting a magnet near the screen. The motion of the magnet would magnetize the mask behind the screen, deflecting the electrons so that over part of the screen they hit the wrong colour phosphor dots. The solution was to try to "comb" out the magnetism… so I've heard.

This was why those original home computers and games (Commodore 64, Atari) were limited to about 40 characters across - the TV video processing did not have the bandwidth to handle 80 characters like a dedicated computer monitor could. Even so, colour (chroma) bandwidth was so poor there was always some bleed on the characters. The whole scheme was based on the logical observation that most colour pictures consist of large blobs of the same colour, with the fine detail carried by brightness contrast. This gave an acceptable colour addition to existing B&W broadcast technology.
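A quick sanity check on that 40-column limit, using NTSC figures (the 8-pixels-per-character cell is my assumption):

```python
# Why 80 columns didn't fit through a TV: broadcast NTSC luma cuts off
# around 4.2 MHz, and one cycle of video draws at most two pixels (one
# light, one dark) within the ~52.7 us visible part of each scan line.
LUMA_BW_HZ  = 4.2e6     # NTSC broadcast luminance bandwidth
ACTIVE_LINE = 52.7e-6   # visible portion of one scan line, seconds
PX_PER_CHAR = 8         # assumed character cell width in pixels

max_px = 2 * LUMA_BW_HZ * ACTIVE_LINE            # ~443 pixels per line
print(f"~{max_px:.0f} px/line -> ~{max_px / PX_PER_CHAR:.0f} characters")
# Roughly 55 characters tops through the luma channel alone; 80 columns
# needs ~640 px, i.e. ~6 MHz that monitors had but broadcast TV didn't.
# Chroma bandwidth was far lower still - hence the colour bleed on text.
```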

A TV technician's toolbox always carried a degaussing wand: basically an AC-powered electromagnet with a handle. You waved it near the TV tube and slowly retreated, the AC field driving the magnetised shadow mask around its magnetisation curve in ever-decreasing swings, hopefully leaving it unmagnetised by the time you got far enough away. TVs had their own degaussing coil wound around the tube; it was little more than a few turns of wire and a thermistor. The thermistor heated up and ramped the current down over a second or so. That was OK for fixing minor magnetisation, but if a kid decided to play with a magnet and the TV, you needed additional help.
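Roughly what that built-in coil is doing over its one-second ramp (the decay constant here is an illustrative guess of mine):

```python
# Sketch of the built-in degauss cycle: a mains-frequency AC field whose
# amplitude the heating thermistor ramps down over about a second, walking
# the shadow mask around ever-smaller hysteresis loops until it is left
# demagnetised.  TAU_S is an illustrative guess, not a measured value.
import math

MAINS_HZ = 50     # 60 in North America
TAU_S    = 0.3    # assumed decay time constant as the thermistor heats up

def degauss_field(t_s, peak=1.0):
    """Decaying AC field: full swing at switch-on, near zero after ~1 s."""
    return peak * math.exp(-t_s / TAU_S) * math.sin(2 * math.pi * MAINS_HZ * t_s)

for t_s in (0.0, 0.25, 0.5, 1.0):
    envelope = math.exp(-t_s / TAU_S)   # size of the remaining field swings
    print(f"t={t_s:.2f}s  swings down to {envelope:.1%} of peak")
# A hand-held wand achieves the same shrinking envelope by distance
# (walking away from the set) instead of by thermistor.
```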

I have heard of people trying lots and lots of power cycles of the TV to fix magnetisation issues, but with mediocre success - not surprising, since the built-in coil only delivers its full kick once the thermistor has cooled down again, which takes minutes rather than seconds.

RFI was an issue with cars and trucks. That's why "resistor" spark plugs were introduced in the 1950s, along with RFI-suppressor spark plug wires. The earlier spark plug wires were just plain wire and radiated RFI far and wide. The interference was enough to wipe out reception for a considerable distance. Various fixes were necessary, both to stop them messing with TV reception and to allow car radios to operate at all: lots of grounding straps for the car hood, and capacitors at the ignition coil and generator output. DC generators like to put out a lot of noise in the radio range as well.

Not only did the colorcast have to be compatible with black and white, but, especially with NBC in the early years, the networks wanted the colors to be really saturated so the viewers with color TVs would be happy they got their money’s worth.

We had long bar magnets for demagnetizing color CRTs. It was mainly for diagnosis, to make sure it was stray magnetism bending the electron beam and not a malfunction in the voltage ramp. We were working with higher resolution tubes than used in TVs so alignment was even more sensitive. ‘Wiping’ the screens never seemed to work.

The use of alternators in cars cut down a lot of the common EMI noise. In the late '70s, EMI shielding regulations came into force, reducing the interference both created by and affecting a lot of electronic equipment.

My mother bought our first color TV in 1969 after the moon landing. The TV had issues with interference so my mother went back to where she bought the TV to complain. The salesman sold her an RF filter for 99 cents, she came home and hooked it up. The problem went away.

In the days of monochrome computer monitors you could use a magnet to distort the text on the screen; remove the magnet and it's all good. I discovered at work one time that it worked with color monitors as well. But there it was permanent.

I seem to recall that some TVs did a degauss when they were first turned on. Does that sound right? Upon further reading I see the answer.

I recall that unscrupulous repair shops had a profitable scam.

After a year or two, CRT TVs would start to look faded, so people would call in an engineer. He would take the front off, clean the dust from the front of the tube and the glass screen, then charge for fitting a reconditioned ("recon") CRT.

The "yoke" around the neck of the tube on a TV (remember those?) was a pair of electromagnets driven by sawtooth signals that deflected the stream of electrons. Between the horizontal sweep (approx 15,750 Hz) and the vertical sweep (60 Hz) it would paint the whole screen in two interlaced fields - 525 lines per frame, roughly 480 of them visible. Put a magnetic field near enough and it would deflect the path of the electrons an additional amount. For monochrome monitors or B&W TV, this simply bent the picture a little. As mentioned above, colour TV used 3 separate, slightly differently aligned electron beams that went through a mask to hit separate coloured phosphor dots. Magnetize the mask, by moving a magnet near the metal, and those finely tuned directional beams are deflected and don't hit the right targets. Computer monitors of the day were the same as colour TVs but with better-bandwidth electronics and, later, more pixels (dots).
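The two sweep rates mesh neatly; a quick check with the exact NTSC figures:

```python
# How the two NTSC sweep rates tile the screen: 15,734 horizontal sweeps
# per second against 59.94 vertical sweeps gives 262.5 lines per field.
# That extra half line is exactly what makes the second field's lines fall
# between the first field's (interlace), for 525 lines per full frame.
H_SWEEP_HZ = 15_734.26
V_SWEEP_HZ = 59.94

lines_per_field = H_SWEEP_HZ / V_SWEEP_HZ
print(f"{lines_per_field:.1f} lines per field")                 # 262.5
print(f"{2 * lines_per_field:.0f} lines per interlaced frame")  # 525
```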

Technology is full of examples of taking a simple concept and embellishing it beyond all imagination.

Fun fact - getting the sawtooth deflection signal to sufficient power to deflect the electrons required high voltage (the greater the tube's deflection angle, the higher the voltage), so the "flyback transformer" could generate up to 30,000 volts. Do NOT poke around in a live TV.

Step-up voltages are pretty neat. The standard DOE field survey meters - the yellow Geiger counters that everybody had in the 1950s - were pretty ingenious. Intended for Civil Defense distribution, they were powered by four (4) easily sourced "D" flashlight cells: two for the (analog) meter circuit, and just two for the rest of the unit - 3 volts DC that was turned into thousands of (low-current) volts to power the Geiger tube - plate and filament voltage. And to make the audible "clicks" to scare everybody.

You could just hear it whine like a mosquito when running, and it would definitely "bite" if you started poking around in there under the hood. Ask me how I know!

You didn't need a high voltage for beam deflection; rather, the horizontal scan stage did dual duty - one winding driving the horizontal deflection coils, the other driving the step-up transformer that, usually via a tripler, got you the anode voltage for the tube. With enough voltage to turn the unwary into a raisin. The current needed to drive the horizontal scan was pretty beefy, and I remember that cheap, fast transistors useful for audio amplifiers were available thanks to the market for them in TV drive circuits.
This was a great example of how TVs were built with minimum parts. When a production run of a single model could be a million sets, saving a single resistor could justify an engineer’s entire salary.
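To put illustrative numbers on that flyback-plus-tripler arrangement (both figures are era-typical guesses of mine, not from any particular chassis):

```python
# The flyback kicks out a high-voltage pulse on every horizontal retrace;
# a diode-capacitor voltage tripler then stacks three of those peaks to
# make the anode supply.  Values below are illustrative guesses.
FLYBACK_PEAK_V = 9_000   # assumed flyback secondary pulse peak, volts
STAGES = 3               # a "tripler": three rectifier/capacitor stages

anode_v = STAGES * FLYBACK_PEAK_V   # ideal multiplier output, losses ignored
print(f"anode supply ~{anode_v:,} V")   # ~27,000 V - in line with the
# "up to 30,000 volts" quoted upthread, and a good reason to keep out.
```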