Whatever happened with cryogenic circuits?

I have an old Time-Life book on chemistry, obviously aimed at laypeople, which was published in 1963 and revised a few years later. In a picture spread about the behavior of liquefied gases, there was a bit about how you could dunk just about any metal into liquid helium and it would become an incredibly strong magnet. Similarly, if you cooled circuits to cryogenic temperatures, the conducting metal would lose all resistance to electricity. So they would be ideal for computers, and as a demonstration of this cutting-edge technology, there was a photo of a “cryogenic memory plane”, which was “not much bigger than the straight pins alongside it”, that could hold…wait for it…240 bits of memory.

But as I read it, I always wondered if the cost of keeping the helium liquefied cancels out any benefit to the magnet or the circuit, and the book said nothing about that. Nor have I seen the concept discussed anywhere else, or since. So, whatever became of this area of research?

You are talking about superconductivity – the phenomenon in which certain materials, when cooled below a critical temperature (typically by immersion in a cryogenic liquid), lose all resistance to electricity.

In my field, cryogenic circuits are used for the reduction of thermal noise – the random jostling of electrons due to their own temperature, which degrades the purity of a signal. They are often used in sensitive receivers and radio telescopes.
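To put rough numbers on that: thermal noise from a resistive source follows the Johnson–Nyquist formula, v_rms = √(4kTRB), so the noise floor drops with the square root of temperature. Here's a quick back-of-the-envelope sketch in Python (the 50-ohm source and 1 MHz bandwidth are just illustrative assumptions):

```python
# Johnson-Nyquist thermal noise from a resistor: v_rms = sqrt(4 * k * T * R * B)
from math import sqrt

k = 1.380649e-23   # Boltzmann constant, J/K
R = 50.0           # source resistance in ohms (typical RF value, assumed)
B = 1e6            # bandwidth in Hz (1 MHz, assumed)

for label, T in [("room temperature", 290.0), ("liquid helium", 4.2)]:
    v_rms = sqrt(4 * k * T * R * B)
    print(f"{label:>16}: {T:6.1f} K -> {v_rms * 1e9:6.1f} nV rms")

# The noise voltage scales as sqrt(T), so dropping from 290 K to 4.2 K
# lowers the thermal noise floor by a factor of about sqrt(290/4.2) ~ 8.
```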

There are certain ceramics which become superconductors when cooled in liquid nitrogen, which is much warmer than liquid helium. Problem is, ceramics aren’t exactly well suited to making wires, so their application is limited.

Superconductors are more common than they used to be, but they’re still rare and expensive. They are used in particle accelerators to create enormously powerful magnetic fields, for example.

Specifically, what they thought would be the wave of the future was Josephson junctions: devices that exploit a property of superconductivity to make very fast, very low-power switching elements. They would still be a neat idea, if cheap room-temperature superconductivity were available and you could match the incredible transistor density that's been achieved by photolithography on doped silicon. You could do a whole book on proposed computer technologies that didn’t work out; try Googling Magnetic Bubble Memory, for instance.

Go figure. In 1963 we thought we’d have a permanent base on the Moon by now.

They’re also used to create the very powerful magnetic fields used by MRI machines.

Cryogenic circuits are still around, but they are only cost-effective in niche applications where the utmost performance is required, regardless of price.

A big factor is that the performance and cost-effectiveness of room-temperature electronics improved so quickly that it never made economic sense either to run cryogenic superconductors or to pursue room-temperature superconductors for consumer electronics. (Yeah… in other words, what Lumpy said.)

I frequently work with cryogenic electronics for optical and infrared sensing. Due to the physics of optical detectors, the only way to get good sensitivity and low background noise is to cool the electronics to the point where you don’t go blind just from electrons being thermally liberated into your sensor material’s conduction band (dark current).
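To give a feel for how strong that effect is: the rate of thermally generated carriers scales roughly with the Boltzmann factor exp(−Eg/kT). Here's a rough sketch (the 0.25 eV band gap is just an illustrative assumption for a narrow-gap infrared material, not any particular device):

```python
# Thermal generation of carriers scales roughly as exp(-E_gap / kT),
# which is why cooling the detector suppresses the dark background so much.
from math import exp

k_eV = 8.617333e-5   # Boltzmann constant in eV/K
E_gap = 0.25         # assumed band gap in eV (illustrative narrow-gap IR material)

def thermal_factor(T):
    """Relative likelihood of thermally exciting an electron across the gap at T kelvin."""
    return exp(-E_gap / (k_eV * T))

room = thermal_factor(300.0)   # room temperature
ln2 = thermal_factor(77.0)     # liquid nitrogen

print(f"Boltzmann factor at 300 K: {room:.2e}")
print(f"Boltzmann factor at  77 K: {ln2:.2e}")
print(f"Suppression from cooling:  {room / ln2:.1e}x fewer thermally excited electrons")
```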

And with infrared sensors, if your sensor and optics aren’t cold, you’ll be blinded by your own long-wave IR blackbody emission.
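A quick way to see why: Wien’s displacement law puts the peak of ~300 K blackbody emission right in the long-wave IR band (roughly 8–14 µm), so warm optics glow at exactly the wavelengths the sensor is trying to see. For example:

```python
# Wien's displacement law: lambda_peak = b / T.
# Warm (~300 K) optics emit most strongly right in the long-wave IR band.
WIEN_B = 2.8977719e-3   # Wien displacement constant, m*K

for label, T in [("room-temperature optics", 300.0), ("cooled optics", 77.0)]:
    peak_um = WIEN_B / T * 1e6   # peak emission wavelength in micrometres
    print(f"{label:>24}: blackbody peak at {peak_um:5.1f} um")

# 300 K peaks near 9.7 um, squarely in the 8-14 um LWIR window; cooling to
# 77 K shifts the peak to ~38 um and cuts total emitted power by (300/77)**4,
# roughly a factor of 230 (Stefan-Boltzmann law).
```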

I also used to work with Josephson junctions and SQUIDs (superconducting quantum interference devices), which are another kind of superconducting circuit that has so far only found niche applications.