Technology doesn't work that way - SamuelA's Pit Thread

It is a trivial thought experiment to show that flipping something in reverse is the same as doing nothing at all. Just so we’re clear, you do know that “flipping a problem” and “doing the problem in reverse” is the same thing? Are you sure you wish to make the claim you want to flip it in reverse?

I mean, anyone can claim to be an expert but I can toss out the names of three University of North Dakota (Fargo, North Campus) papers which prove that flipping a problem in reverse means doing the same problem the same way.

Can you?

See, this is what I’ve been saying and will continue to say. If you do not understand this simple function, then we are left to doubt your actual competence in both flipping and reversing problems.

I’m beginning to wonder if SamuelA is even a real person. It’s as if someone wrote a bot to simulate someone afflicted with Asperger’s Syndrome - like the Aspy2000 or something.

No offense to anyone suffering from Asperger’s; it can be very debilitating. I don’t mind poking fun at SamuelA, he obviously takes no offense to the insults. But am I still an asshole for being critical? I don’t know, it’s kind of like that tree-falling-in-the-forest question: if I insult someone and they don’t care, am I really insulting someone? And I still can’t be certain that SamuelA is 100% human.

How nice. You don’t even understand the argument I made, and you’re trying to put me down anyway. Again, you’re just putting the underwear on your own head. An arbitrarily smart and unbiased agent reading this thread would conclude that you, sir, are a moron.

My God, you’re a fucking ignoramus! I don’t think even you are so dumb that you can’t go back and read more carefully. I said no such thing as you claim and in fact said quite the opposite. So now we can count reading comprehension among your multitude of faults.

Truth tables can be implemented with any arbitrary set of logic gates. In fact, functionally, that’s exactly what logic gates implement. This is neither interesting nor relevant to the discussion of computation. What I said – and I said it explicitly – was that this is not a sufficient condition to implement Turing equivalence. For that you need things like an instruction set, persistent memory, and decision-making branching functions. You need the ability to process symbols in a semantic context, the way that a series of letters and spaces conveys a message to sentient humans, or a tape of symbols conveys meaning to a theoretical Turing machine.
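
To make the distinction concrete, here’s a toy sketch (Python, purely my own illustration; the names are made up). The first function is just a truth table – a stateless lookup – while the second needs the things a truth table alone doesn’t give you: persistent state and a branch.

```python
# Illustrative sketch only: a pure truth table is a stateless lookup --
# the same inputs always give the same output, and it "finishes" in one step.
XOR_TABLE = {
    (0, 0): 0,
    (0, 1): 1,
    (1, 0): 1,
    (1, 1): 0,
}

def combinational_xor(a, b):
    """One fixed input-to-output mapping; no memory, no branching, no loops."""
    return XOR_TABLE[(a, b)]

# Turing-style computation adds the missing ingredients: persistent memory,
# an instruction set, and conditional branching.
def countdown_machine(n):
    """A trivial program with state and a branch: it loops an input-dependent
    number of times, something no single truth-table lookup can express."""
    steps = 0                 # persistent state
    while n > 0:              # decision-making branch
        n -= 1                # "instruction" that modifies state
        steps += 1
    return steps
```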

Maybe you can get at least a partial refund on your CS degree?

No, no. It is bras I put on my head, in accordance with the work of Gary and Wyatt (1985). They have demonstrated it is quite trivial to build a hawt woman with computational power, and their work complements many of the theories you discuss here, to our betterment.

Now it has been argued by Max and Ian (also 1985) that the theories of G&W were bogus, that their attempt to create a computational consciousness merely resulted in the creation of an ICBM. However, no evidence has been offered to prove this ICBM ever existed, and Ian’s continued declaration that “it’s under Wyatt’s house, the dweeb!” has been met with skepticism.

But you should know this if you were as learned as you tell us. Personally, I have my doubts. What are your credentials in the weirder sciences?

Umm, what? Ok now I think we’ve blown right past each other.

I’m saying what the brain does can be functionally mimicked by computation, and what I mean by computation is digital logic implementing truth tables. Semantics? Symbol processing? Tapes? None of that matters.

Now it so happens that I know any Turing complete machine can implement any truth table, hence I say you can emulate a brain with Turing machines. That also means an AI based on Turing machines can be constructed that will be able to perform any cognitive task a human does (albeit possibly too slowly to be useful).

This doesn’t mean the brain has all of the features of Turing machines.

So you’re telling me that all your insults were based on a false understanding of my point?

Ok, well, do you agree with my real point, then? My real point is that today we can build brain-like computer chips. We haven’t quite needed all the complexities of a real brain, not yet. These chips are built primarily around a single computational subunit called a multiply-accumulate unit. Out of all the features of modern computers, they really just need a shit-ton of this one single functional unit.

If you work out what a brain synapse functionally is, the math operation is charge_level += 1 * (transmitter_vesicles * transfer_function * receiver_function). Or multiply-accumulate. Though if you were going for strict brain emulation, you would not use quite the same algorithms that modern AI research is using.
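
As a rough sketch of what I mean (Python, purely illustrative; the variable names are my own shorthand, not anybody’s published neuron model), each synapse’s contribution is just one multiply-accumulate into the neuron’s charge level:

```python
def integrate_synapses(charge_level, synapses):
    """synapses: list of (transmitter_vesicles, transfer_function, receiver_function)
    tuples, each already reduced to a scalar gain for this simplified sketch."""
    for transmitter_vesicles, transfer_function, receiver_function in synapses:
        # one multiply-accumulate per synapse
        charge_level += 1 * (transmitter_vesicles * transfer_function * receiver_function)
    return charge_level

# Example: three incoming synapses nudging the membrane charge (made-up numbers).
print(integrate_synapses(0.0, [(2.0, 0.5, 0.8), (1.0, 0.9, 0.3), (3.0, 0.2, 1.0)]))
```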

And I made arguments saying that we can build ever bigger arrays of these chips, and both emulate the brains of once-living people, if we wanted to, and build artificial intelligences that have the same functionality we have, or better.

That’s my point. I don’t give a rat’s ass about symbols or displays.

Oh, just checking.

My working definition of computation - truth tables - is exactly correct.

*Computation* is any type of calculation[1][2] that includes both arithmetical and non-arithmetical steps and follows a well-defined model understood and described as, for example, an algorithm.

The study of computation is paramount to the discipline of computer science.

Guess I won’t need to be asking my alma mater for my money back yet.

You’d have to wait for that traveling carnival to come back to your town first.

What a childish insult. You call yourself an adult, huh.

No, my insults were based on the conclusion – as shown in previous posts, and more below – that you don’t know what the fuck you’re talking about.

Of course you don’t. Because you don’t understand how a Turing machine defines computation or why it’s a central tenet of computer science.

Answer these questions. Do it in your own words, as I did when trying to explain these concepts to you. Try to redeem yourself from the impression that you’re the total ignoramus that everyone now sees on full display.

  1. I have a traditional electronic calculator built out of logic gates that implements truth tables and can add, subtract, multiply, and divide. Is this device Turing equivalent? Why or why not? Explain your answer in relation to the Wikipedia extract that you plagiarized in post #367.

  2. Can the computer on my desk be regarded as Turing complete? Why or why not?

  3. Are cognitive processes computational? If so, why is this a fundamental point of contention in cognitive science?

And I’ll say a couple of things here while awaiting your responses.

I admit that I thought for a brief while, based mostly on your claims rather than your actual posts, that education may actually have hammered some theoretical knowledge into your thick skull and you were just lacking practical experience, and your obnoxious pomposity was just a quirk of youth and immaturity. I’m now convinced that Tripler was literally correct. You really are a spectacularly ignorant blowhard. It’s amazing. I’m no longer being facetious when I say that if you really do have a CS degree, you should demand your money back. You remind me of an M.Sc. candidate that was once foisted off on me to “help” with a project. He was a fucking waste of space.

I had a sort of New Year’s resolution in the back of my mind to try to be more conciliatory even here in the pit – but I’ve found that with you it’s impossible, though you can see the tattered remnants of it in the last sentence of my post #358.

Answer the questions.

I think if you did a little Christian fellowshipping with Czarcasm you’d find he’s really easy to talk to and not at all insulting.

The thread reads nicely and amusingly if the SamA posts are skipped.

If he knows as much about religion as he does about science, we’ll have a grand time.

If he knows as much about religion as he does about science, he probably thinks that “Jesus H. Christ on a pogo stick” is the opening line of the Pontifical High Mass.

No, everyone knows the opening line is; “Spectacles, testicles, watch, wallet.”

Nice rebuttal, dipshit. Still haven’t figured out this whole “Pit” thing, I see.

  1. No, missing features. No branch statements at a minimum.

  2. Yes. If it had infinite memory it could emulate any other computing system.

  3. Yes. Computation != Turing complete. One is a subset of the other. Obviously.

The point of contention is that neurons are executing branch statements, and you can obviously build a neural net “from the outside” to emulate any other computing system. But the human brain wasn’t built from the outside by some master planner; it was made procedurally from relatively simple rules, and all the neural weights are somehow derived from actual feedback – well, chemical signals from the animal – plus an initial gain factor for each specific neurotransmitter-receptor pairing.
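
If you want a toy picture of what I mean by neurons “executing branch statements” (my own sketch, made-up weights, nothing more), a threshold unit is effectively an if-statement over a weighted sum, and the weights come from feedback rather than from someone hand-writing the branch:

```python
def threshold_neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    if total >= threshold:    # the "branch statement" the neuron implements
        return 1
    return 0

# With these hand-picked weights the unit behaves like an AND gate;
# a learning rule would instead arrive at the weights from feedback.
print(threshold_neuron([1, 1], [0.6, 0.6], 1.0))  # -> 1
print(threshold_neuron([1, 0], [0.6, 0.6], 1.0))  # -> 0
```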

You know, a topic you might even appreciate. If you were wanting to build a front end to protect a computer system, and you wanted to prove it was absolutely reliable, you’d actually want that front end to *not* be Turing complete.

For instance, you could implement your TCP/IP stack in an FPGA (and don’t cheat and use a software-defined CPU like Xilinx offers). Each memory buffer fundamentally can’t overflow; there’s nowhere for the extra bits to go. Any corruption can’t result in the machine doing anything unwanted, as each subsystem passes data to another component that has only the features needed to perform that subsystem’s functional purpose.
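
As a loose software analogy of that point (my sketch only – the real argument is about hardware, where the buffer literally has no wires for the extra bits), a buffer with a hard capacity and no fallback path looks roughly like this:

```python
class FixedBuffer:
    """A buffer with a fixed capacity: excess bytes are simply dropped,
    so an oversized input can't spill into anything else."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = bytearray()

    def write(self, chunk):
        room = self.capacity - len(self.data)
        self.data += chunk[:room]      # the extra bits have nowhere to go
        return len(chunk[:room])       # report how much was actually accepted

buf = FixedBuffer(capacity=16)
accepted = buf.write(b"A" * 100)       # oversized input is truncated, not overflowed
print(accepted, len(buf.data))         # -> 16 16
```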

Eric Drexler has pointed out that if silicon were basically free, our whole concept of software would actually be obsolete. You’d have a lot more reliability if computers didn’t have general-purpose capabilities, but instead each application got a dedicated chunk of silicon logic gates and memory to execute that application’s features. This would make most forms of computer hacking impossible.

To explain in rough detail how that would work: right now, say there’s a button on your screen from your web browser. When you click that button, instructions cause a branch processing unit in one of your CPU cores to evaluate what to do next. The ‘Eric Drexler way’, there would be a chunk of circuitry just for your web browser and nothing else, and every button you can click would actually trigger circuitry for that feature only.

There are obvious practical difficulties with this, but you might realize there are advantages, as well. Latencies would be almost zero - the computing system would respond in realtime, limited basically only by propagation delay. And it would be nearly impossible to hack. If a specific application didn’t want to share data with any other application, it could specify a hardware memory buffer that has no wires going to anything but circuits dedicated to that application.
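
If you want a very loose software picture of that dispatch structure (my own sketch, and only an analogy – Python objects obviously aren’t electrically isolated the way dedicated silicon would be), every feature gets its own handler with its own private buffer, and nothing is wired to anything else:

```python
class DedicatedFeature:
    """Stand-in for a per-feature chunk of circuitry with private memory."""
    def __init__(self, name):
        self.name = name
        self._private_buffer = []          # no other feature touches this

    def trigger(self, payload):
        self._private_buffer.append(payload)
        return f"{self.name} handled {payload!r}"

# One "circuit" per clickable feature, instead of one general-purpose CPU
# branching its way through shared code and shared memory.
features = {"back_button": DedicatedFeature("back_button"),
            "reload_button": DedicatedFeature("reload_button")}
print(features["reload_button"].trigger("click"))
```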

. . . and there’s the redirect. Notice, ladies and gentlemen, when a SamuelA is challenged in the wild, he’ll make a masterful illusion of himself (or he’ll drop ink like an octopus in a mini-rant and/or swim to safer waters) to distract you from the arguments that he doesn’t agree with. Smoke and mirrors, ladies and gentlemen. Smoke. And. Mirrors.

Don’t worry SamuelA, I’ve just not had the time to walk through all of the quagmire of your nuclear weapon/spacelift posts. In due time, sir. In due time. Oh, and 4.96 E22 kg, dependent on propellant, but I’ll put $20 down that you’ll somehow find fault with that.

Whatever happened to the DryPasta idea?

Tripler
Smoke. And. Mirrors.

4.69 E22 kg. Typo.