I'm afraid of the "Singularity"

He also tried to use his work experience, which isn’t directly relevant, to give extra weight to his opinion.

You were pretty dismissive of people when they were honest about not being involved in quantum computing and/or AI, but you failed to offer your own qualifications.

That isn't just rude, that is mendacious. If you wish to dismiss someone like that, at least do it with some authority.

In what way are you involved in AI research?

Me? Not at all. Same?

What’s your point? Are you looking for an excuse to threadshit?

Computers control factories already. An IC Fab is basically controlled by computers, with people monitoring and dialing in the right recipes (which are developed with computer assistance.)

I agree that computers will perform simple surgery - with a doctor watching. Computers have the advantage of being programmed by the best in the field, and thus do better than many people.

Case in point - around 20 years ago I managed a computer aided diagnosis tool - diagnosis of failing circuit boards, not people. It learned. It was designed to learn from repairs that worked, but we discovered that the senior repair people preprogrammed it with their knowledge when things came up. The system didn’t do as well as they did, but it did better than what junior people could do. It learned, but it was not AI - it used a purely statistical approach. Interestingly enough a competing, AI approach - an expert system - never got anywhere because no one had the time to program it.
So everyone expects an AI to learn - right now none have the basis to learn.

Tell me about that statistical approach. I wonder if it might be scalable to more complex tasks.

Neural networks are basically a statistical approach; a multilayer neural network is a universal function approximator.
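To illustrate the approximation claim, here's a minimal sketch: a one-hidden-layer tanh network fit to sin(x). I'm cheating by using random hidden weights plus a least-squares readout instead of backprop, just to keep it short and deterministic; the target function and layer size are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function to approximate on [-pi, pi]
x = np.linspace(-np.pi, np.pi, 400).reshape(-1, 1)
y = np.sin(x)

# One hidden layer of tanh units with random weights,
# then solve for the output-layer weights by least squares.
n_hidden = 200
W = rng.normal(scale=2.0, size=(1, n_hidden))
b = rng.normal(scale=2.0, size=n_hidden)
H = np.tanh(x @ W + b)                      # hidden activations
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ w_out
max_err = np.max(np.abs(y_hat - y))
print(max_err)  # tiny: 200 tanh units approximate sin closely
```

With enough hidden units, any continuous function on a bounded interval can be approximated this way, which is the universal-approximation point.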

The current best score on handwriting recognition of the MNIST database is based on a simple many-layered, many-neuron network and is roughly at human error rates, which is a little surprising, because most of the recent progress has been in more complex, sophisticated methods.

Recently (the last 8 years) significant progress in multi-layered networks has been made by Hinton, using a method of training a special type of network, the restricted Boltzmann machine (RBM). By using the output of one network/layer as input to the next, he has devised a way to train multi-layer networks more efficiently and accurately than previous methods, which results in better overall solutions, since the various layers tend to classify at different levels of abstraction.
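A rough sketch of that greedy layer-wise idea, assuming a plain binary RBM trained with one step of contrastive divergence (CD-1). The toy data, layer sizes, and hyperparameters here are invented for illustration; this isn't Hinton's actual code, just the shape of the method.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def train_rbm(data, n_hidden, epochs=1000, lr=0.1):
    """Train a binary RBM with one-step contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
    a = np.zeros(n_visible)   # visible biases
    b = np.zeros(n_hidden)    # hidden biases
    for _ in range(epochs):
        # Positive phase: hidden probabilities given the data
        h_prob = sigmoid(data @ W + b)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: reconstruct visibles, then hiddens again
        v_prob = sigmoid(h_sample @ W.T + a)
        h_prob2 = sigmoid(v_prob @ W + b)
        # CD-1 updates: positive minus negative statistics
        W += lr * (data.T @ h_prob - v_prob.T @ h_prob2) / len(data)
        a += lr * (data - v_prob).mean(axis=0)
        b += lr * (h_prob - h_prob2).mean(axis=0)
    return W, a, b

# Toy binary data: two repeated patterns
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 1, 0, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 0, 1, 1, 0]], dtype=float)
W1, a1, b1 = train_rbm(data, n_hidden=3)

# Greedy stacking: the first layer's hidden probabilities become
# the "data" for the next RBM, one layer at a time.
h1 = sigmoid(data @ W1 + b1)
W2, a2, b2 = train_rbm(h1, n_hidden=2)
```

The stacked layers can then be unrolled into a deep network and fine-tuned with backprop, which is where the efficiency gain over training the whole deep net from scratch came from.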

I PMed you a reference to a precursor statistical diagnostician. The one I mentioned doesn’t have a published paper describing it. It definitely scales - it was used for returned circuit packs for the #5ESS digital switch in production. This was not an academic exercise.

Everything is equivalent in a sense. Neural nets can be thought of as one way of doing regression by adjusting the coefficients. Neural nets weren’t nearly as popular back when this work was being done.
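For what it's worth, here's the regression view in miniature: plain gradient descent adjusting two coefficients on synthetic data, which is exactly the loop a single linear-activation neuron would run. The data and learning rate are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: y = 3x + 1 plus a little noise
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 1.0 + rng.normal(scale=0.05, size=100)

# "Adjust the coefficients" by gradient descent on squared error
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    err = (w * x + b) - y
    w -= lr * 2 * (err * x).mean()   # d(MSE)/dw
    b -= lr * 2 * err.mean()         # d(MSE)/db
```

After training, `w` and `b` land close to the true coefficients 3 and 1; swap the linear unit for layers of nonlinear units and the same coefficient-adjusting loop becomes neural-net training.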

Thank you. That sounds intriguing.