Alternatives to classical computing

I’ve heard of quantum computing and DNA-based computing, so what are the most promising concepts that may usurp classical computing?

Quantum computing is certainly a thing. I have not heard of DNA computing (although I have heard of DNA storage…storing data on DNA).

While quantum computing is cool and has some important uses, it will not replace classical computing for a loooong time (if ever). It is really good at some particular tasks and pretty bad at most computing tasks.

I do not see a real alternative to classical computing on any horizon.

But…I am not an expert so maybe there is.

DNA computing is basically encoding a particular problem in strands of DNA, then using the natural strand-matching ability of DNA to solve the problem. It’s a clever idea, but like quantum computing, it really only solves a certain set of problems. I don’t see either one as “replacing” conventional computing.
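
To make that a bit more concrete, here’s a toy Python sketch (the little graph and all the names are made up for this post) of the kind of brute-force search the DNA approach runs chemically and massively in parallel: vertices are encoded as strands, matching strands ligate into candidate paths, and then you filter for the paths that visit every vertex once.

```python
from itertools import permutations

# Toy, in-silico version of the search a DNA computer does chemically.
# The directed graph below is illustrative only.
edges = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("B", "D")}
cities = {"A", "B", "C", "D"}

valid_paths = [
    path
    for path in permutations(cities)                          # every candidate ordering ("strand")
    if all((u, v) in edges for u, v in zip(path, path[1:]))   # keep only chains whose links exist
]
print(valid_paths)  # the surviving "strands": paths visiting every city exactly once
```

The point is that the chemistry generates and filters all of those candidates at once, which is why it only pays off for this narrow style of combinatorial problem.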

Analogue Computing is a possibility (not a strong possibility IMO), but people are working on it.

There’s nothing wrong with analog computers, but people did go on to build digital computers to do the things analog ones couldn’t do as well.

Just a note, by the way: “Analog” computing is not the same thing as “analogue” computing. “Analog computing” means that instead of signals just being discrete values (1 or 0 in a binary computer), they can be any of a continuum of values. So you can have your equivalents of transistors all having different threshold values, and outputting different values depending on the values of your inputs, but you’d still have the equivalent of circuits, and logic gates, and values in some circuits affecting values in others, and you can program them.
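
A toy contrast, just to illustrate what I mean (both functions are made up for this post, not any real analog hardware):

```python
# Digital element: inputs are snapped to 0/1 at a threshold.
def digital_and(a: float, b: float, threshold: float = 0.5) -> int:
    return int(a >= threshold and b >= threshold)

# One possible "analog" counterpart: the output varies continuously
# with the inputs instead of collapsing to 0 or 1.
def analog_and_like(a: float, b: float) -> float:
    return min(a, b)

print(digital_and(0.7, 0.4))      # 0: the 0.4 falls below the threshold
print(analog_and_like(0.7, 0.4))  # 0.4: the intermediate value is preserved
```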

“Analogue computing”, by contrast, means that you build some physical system that’s analogous to the system you actually want to study. For instance, you can model an economy with a variety of reservoirs and tubes connecting them, such that the flow of water between them is an analogue to the flow of money. Or contrive for a fluid to behave in some way similarly to how spacetime behaves in general relativity, so you can model a black hole. There’s a lot more variety possible here, but an analogue computer has to be completely built for the specific problem it’s intended to model, and any new problem requires designing and building a new computer.

Do you have a reference for that spelling distinction?

I’ve only seen “analog” vs “analogue” as a distinction between American and British English. (Similar to “color” vs “colour”.)

For what you’re talking about, I’d call “analog electronics” vs “analog computing”.

I have one of these,

It’s a “gear” based calendar that works by matching the month with the year on the top, and then viewing the output calendar month at the bottom. It’s a finite-state machine, implemented non-electronically.
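
The “finite” part is easy to check in software, by the way: a month grid is fully determined by the weekday of the 1st and the number of days in the month, so the gadget only ever needs a small fixed set of output plates. A quick Python sanity check (illustrative only; it says nothing about how the actual device is built):

```python
import calendar

# Count the distinct month layouts: (weekday of the 1st, days in the month).
layouts = set()
for year in range(1900, 2100):
    for month in range(1, 13):
        first_weekday, days_in_month = calendar.monthrange(year, month)
        layouts.add((first_weekday, days_in_month))

print(len(layouts))  # far fewer distinct layouts than there are (year, month) pairs
```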

See also,

There were programmable analog computers that could model a system of your choosing. The key element was the operational amplifier, which could be used to integrate or differentiate a signal or to add or subtract signals. There were also modules that could multiply or divide signals; these were based on circuits that had exponential or logarithmic response. These computers were programmed with a “patch panel”, which was used to wire up the op amps to solve the differential equations that described the system of interest. In the early 1970s I remember the excitement about “hybrid” computers, which combined a digital computer with an analog one.
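
If anyone wants a feel for what those patched-together integrators were doing, here’s a rough digital sketch of the same loop: a summing junction and two integrators solving a damped oscillator. The numbers are made up, and a real machine would do this continuously with op amps rather than in discrete steps.

```python
# Discrete stand-in for an analog patch solving x'' + 2*zeta*w*x' + w^2*x = 0.
w, zeta = 2.0, 0.1        # natural frequency and damping (illustrative values)
x, v = 1.0, 0.0           # initial displacement and velocity
dt = 0.001                # step size standing in for continuous time

for _ in range(10_000):
    a = -2 * zeta * w * v - w * w * x   # "summing amplifier": forms the acceleration
    v += a * dt                          # first "integrator": acceleration -> velocity
    x += v * dt                          # second "integrator": velocity -> displacement

print(x, v)  # the decaying oscillation you would watch on the machine's output
```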

The analog-versus-analogue usage seems to have become something of a de facto convention in some areas, enough that a lot of people will recognise it as an attempt to distinguish meaning.

But it isn’t any sort of anointed, officially defined meaning. Maybe common usage will bring that about in time. Or the world will move on.

A brain is an analog computer and it doesn’t have to be built for a specific problem.

Perhaps not but you would want to use the best tool to solve a specific problem.

You missed the distinction Chronos was making. The brain is an analog computer, but not an analogue computer. The structure of the brain doesn’t make an analogy with the problems it solves.

As an example, consider the Deltar computer:

Specifically:

The design of the Deltar was based on the hydraulic analogy of the properties and behaviour of water and electricity. Working with analogs of quantities such as the water level, rate of flow and water storage, the design for the calculator basically used the electrical quantities charge, potential, inductance and capacitance.

Each flow element of the physical system had an analogous electronic element.

That said, I’m not sure that distinction is that reliable in practice. Especially since the boundary is somewhat fuzzy. The Deltar probably could have been used for other problems, since it is at least partly reconfigurable. But the hydraulic analogy is a very close one, especially in that direction (you can’t simulate all electronic behavior with water, but you nearly can in the other direction).

This. ‘Analogue’ is just how we write the word that Americans write ‘analog’.

It may be that some distinction has arisen (in a similar way, perhaps, to how I have to refer to hard drive storage as ‘disk’ rather than ‘disc’, even though the latter is the UK spelling for a flat circular thing)

If the distinction truly exists, wouldn’t your linked article be talking about ‘analogues of quantities’ rather than ‘analogs of quantities’?
The article appears to be a counterexample to the point being argued.

The distinction is certainly there in the hardware: there are physical devices where the properties have direct analogues with what they are trying to simulate, and something approaching a 1:1 mapping between components in the device and in the target system. On the other hand, there are essentially general-purpose analog computers where the properties are not so analogous, or at least correspond to things that don’t have great physical meaning.

Whether these two types of machines can be reliably distinguished by the spelling of “analog” I couldn’t say. And I acknowledged above that the boundary is rather fuzzy in some cases. But regardless, dougrb seemed to be missing the point. The human brain would clearly be on the “analog” side of things in Chronos’ taxonomy, being a general-purpose analog computer.

The distinction I believe actually exists is between digital vs analogue (or digital vs analog in US English) - that is, discrete vs continuous.

Sorta true, for certain definitions of “specific problem”. Analog computers (or analogue – I’m not sure if the spelling distinction is generally understood to indicate anything beyond a distinction between US and British spelling) certainly have been built for specific tasks. But with the advent of solid state operational amplifiers, electronic analog computers like the EAI 8800 (circa 1967) could handle such a large class of problems that they were considered to be effectively general-purpose.

For that matter, the humble slide rule was a pretty general-purpose analog calculator. Conversely, early digital computers like the ENIAC (c. 1945) were built primarily for specific purposes; the ENIAC was funded by the US Army and intended mainly for computing artillery firing tables. Probably the most significant difference in scope of capabilities between analog and digital computers was that the capabilities of the latter scaled virtually without limit as speed and memory capacity grew.
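
The slide rule trick, for anyone who hasn’t thought about it lately: the scales are ruled logarithmically, so sliding one against the other adds lengths, and adding logarithms multiplies the numbers. A quick Python illustration (the operands are just examples):

```python
import math

a, b = 3.0, 4.0                            # illustrative operands
length = math.log10(a) + math.log10(b)     # the physical displacement along the two scales
product = 10 ** length                     # what you read off the combined scale
print(product)                             # ~12.0, limited only by how finely you can read the scale
```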

It’s not that simple. There are aspects of neuron-synapse behaviour that involve a continuum of values and firing thresholds, but there is also evidence that higher-level cognitive functions closely parallel the symbol-processing paradigms of digital computing that can be modeled by a Turing machine (see, for instance, the computational theory of mind).

I think there is also a distinction between “programme” and “program”.

Ostensibly, it is just the UK vs the USA, but over time, at least here in the UK, the former refers to a show or a plan - “TV programmes”. The latter is reserved for computer stuff.

This is an odd one. As you say, the U.K. has adopted the U.S. spelling for the computer program sense only.

A U.K. “TV programme” would usually be called a “TV show” in the U.S. Do Americans use “TV program” at all? How about “theater program”?