The next page in the book of AI evolution is here, powered by GPT-3.5, and I am very, nay, extremely impressed.

Microsoft may be trolling for media attention.

I thought it had already done so, and that it isn’t the first to do so (Google’s LaMDA reportedly did last summer), though I seem to be pulling up conflicting reports about this.

I’ve seen reports of it passing the Turing test, but nothing official under controlled conditions. The Loebner Prize competition involved a controlled setup, but that fell apart some time back. With several chatbots in the running, we might get a real televised competition. Kind of a Super Bowl for geeks.

Well, now we’re just getting philosophical.

Going back several posts, to the question of training vs. learning: you’re both kind of right.

There are actually two programs at work. The first program creates a neural network and trains it. That program is capable of learning. The output of that program is a network topology configuration, which is a description of all the values needed to recreate the network: all of the edge weights, thresholds, biases, etc.
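To make that concrete, here is a rough sketch of the training side (a toy two-layer NumPy network, nowhere near GPT scale; the file name topology.json is just illustrative). The only artifact it hands off is that configuration:

```python
# Sketch of the "first program": train a tiny network, then write out a
# topology configuration: every weight and bias needed to recreate it.
import json
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: learn XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network: 2 inputs -> 4 hidden units -> 1 output, sigmoid activations.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (squared-error gradient): this is the part that learns.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

# The output is not a running program, just a description of the network.
topology = {"layers": [
    {"weights": W1.tolist(), "bias": b1.tolist()},
    {"weights": W2.tolist(), "bias": b2.tolist()},
]}
with open("topology.json", "w") as f:
    json.dump(topology, f)
```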

That topology is then used by another program to create the neural network that users actually interact with. That program takes an input from the user, feeds it into the neural network, and the network produces an output. The program then decodes that output into the result for the user. This second program does not learn.
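And a matching sketch of the second program: it only rebuilds the network from the saved configuration and runs the forward pass, so nothing in it learns.

```python
# Sketch of the "second program": rebuild the network from the topology
# config and run the forward pass only. Nothing here updates any weights.
import json
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

with open("topology.json") as f:
    layers = json.load(f)["layers"]

def predict(x):
    a = np.asarray(x, dtype=float)
    for layer in layers:
        a = sigmoid(a @ np.array(layer["weights"]) + np.array(layer["bias"]))
    return a

# Feed the user's input through the network and decode the raw output.
print("yes" if predict([1, 0])[0] > 0.5 else "no")
```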

(Note: it is probably really the same code, just with a toggle to put it in training mode vs. production mode.)

To the contrary - realistic.

The second program is actually much smaller. The first program runs in the cloud. The second can be supported by edge computers.

Do you believe artificial intelligence (or is artificial consciousness the better term here?) to be impossible? Or just the potential ChatGPT version of it?

It could be. The computationally expensive part is turned off when running in production. Most of my AIs just use one code library, and I control what is on or off with compile settings. It helps to have the functionality shared between the two so that you do not get any unexpected behavior changes due to differences in the code. For example, forward propagating a sample in production is identical to forward propagating a sample in training, so these two pieces of code should be shared/common.
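Something like this, as a minimal sketch; the TRAINING flag is just a stand-in for whatever compile setting or build flag the real library would use:

```python
# One shared forward pass used by both builds; only the learning path is
# gated. TRAINING is a placeholder for an actual compile/build setting.
import numpy as np

TRAINING = False  # on for the cloud training build, off for production

def forward(x, layers):
    # Identical in training and production, so it lives in the shared library.
    a = np.asarray(x, dtype=float)
    for W, b in layers:
        a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
    return a

def run(x, layers, target=None, lr=0.1):
    out = forward(x, layers)
    if TRAINING and target is not None:
        pass  # backward pass / weight updates would go here (training build only)
    return out
```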

AI labels the processes accomplished by machines. Intelligence is a property of humans. The question is whether current computational machines can bridge the gap. Can machines possess human intelligence? The answer is no, simply because there is no mechanism within the machine to support intelligence.

Can a machine mimic human intelligence? The answer is yes. In fact it can appear to be hyper-intelligent. But it will never be conscious, aware, or able to comprehend what it is doing. There is nothing in a computer that would allow it to do so. At any given instant there is a number in the adder. Everything else is static. There is nothing in the computer that would allow it to be aware.

For example, a digital flight simulator just generates numbers that go to displays that humans interpret as aircraft flight situations. These situations elicit strong emotional responses in the human observers. But the computer is just throwing out numbers. Nobody believes that anything is flying. It’s just a simulation. There is nothing in the computer that supports flight.

In the example of simulating human behavior, the same thing is true. The computer is slinging out numbers that are being interpreted by humans as meaningful text. Any similarity to human behavior is in the minds of the humans. The machine is the same one that produced numbers to simulate flight. There is no more reason to believe that the numbers represent machine behavior than there is to believe a flight simulator is flying. It’s just a simulation. There is nothing in the machine that constitutes behavior.

I believe human intelligence cannot be achieved by digital computation. It can be simulated by digital computation.

No argument there. I assume there are some amazing fragments being developed by individuals with equally amazing results, but not yet on the scale of GPT.

What specific mechanism within the moist lump of flesh inside your skull supports intelligence, and how have you verified that this mechanism is not accounted for in AI?

I think you are drawing a distinction without a difference and will agree to disagree.

If it is a digital computer, then the only decision-making component is the adder and its status register. Intelligence is not a property of digital adders. Perhaps you can explain otherwise.

So, do you believe the flight simulator can leave the ground? If not, then why would you believe the behavior simulation could think? Both illusions are produced by the same adding machine.

John Oliver’s take on AI

Low-brow journalistic theater.

What made you land on ‘adders’ as the fundamental component of computing? Why didn’t you stop at the logic gate, or a transistor? What’s special about an adder? It’s just a handful of logic gates.
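For the record, here is a one-bit full adder written out as gate operations (a sketch, not any particular ALU design): two XORs, two ANDs, and an OR, chained to add n-bit numbers.

```python
# A one-bit full adder is literally a handful of gates.
def full_adder(a, b, carry_in):
    s1 = a ^ b                              # XOR
    total = s1 ^ carry_in                   # XOR
    carry_out = (a & b) | (s1 & carry_in)   # AND, AND, OR
    return total, carry_out

# Chain the full adders to get the kind of adder that sits in an ALU.
def ripple_add(x, y, bits=8):
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(23, 42))  # 65
```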

And why do you think there’s no similar process in the brain? What do you think a neuron connected to a bunch of synapses is doing?
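Roughly this, at least in the artificial version: it integrates weighted inputs from its synapses and fires once the sum passes a threshold (a toy sketch with made-up numbers).

```python
# Crude model of a single neuron: a weighted sum of inputs plus a threshold.
import numpy as np

def neuron(inputs, weights, bias):
    activation = np.dot(inputs, weights) + bias   # integrate weighted inputs
    return 1.0 if activation > 0 else 0.0         # "fire" past the threshold

print(neuron([1.0, 0.5, 0.0], [0.8, -0.4, 0.3], bias=-0.2))  # 1.0
```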

In any event, there is no ‘intelligence’ in an adder, just like there is no intelligence in a neuron. Intelligence is an emergent property.

There may in fact be differences between the human brain and a computer running a large neural net that preclude consciousness in a computer; we don’t know, because we don’t know what consciousness is or how it’s created, your apparent certainty on the issue notwithstanding.

You’ve been on the ‘adder’ thing for a while, with multiple people explaining why that framing is wrong, to no avail. You seem to think an AI can’t learn without modifying its underlying machine code, which is completely wrong. When an AI is fine-tuned, its weights are updated (or new layers are added to its neural net), and it learns. Its context window allows it to temporarily learn. If an AI couldn’t learn, prompt engineering would not work.
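To illustrate (a toy PyTorch sketch, not how any particular model is actually fine-tuned): the program’s code never changes during learning; what changes are the weights, which are just data.

```python
# Sketch: fine-tune by freezing a pretrained base and training a small added
# layer. No machine code is modified anywhere; only weight values change.
import torch
import torch.nn as nn

base = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32))  # stand-in for a pretrained model
for p in base.parameters():
    p.requires_grad = False          # pretrained weights stay frozen

head = nn.Linear(32, 2)              # the added, trainable layer
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)              # toy fine-tuning data
y = torch.randint(0, 2, (64,))

for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(head(base(x)), y)
    loss.backward()                  # gradients flow only to the new layer
    opt.step()
```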

I found it quite good as a layperson’s explanation of AI.

What decision-making process exists in the lump of flesh inside your skull?