What Will We DO With 1024-bit Computers?

I’ve owned 8-bit, 16-bit, and 32-bit machines in my life, and we’re now up to 64-bit. It’s entirely possible that within my lifetime we’ll have 1024-bit machines. I imagine that before those are developed we’ll have photorealistic images generated on the fly for things like gaming, and I can see scientists and researchers drooling over how fast a 1024-bit machine could handle complex equations. But what are the rest of us going to do with that kind of horsepower under the hood?

Porn, of course.

In your lifetime? How 'bout next year? There’s already code being written for it, and papers speculating about it. Google “1024 bit word”.

But, yes, I agree. Porn, sharper than ever, with new input-output devices that will be banned in many Southern states before they ever hit the shelves.

Maybe we could watch simple applications start in, say, twice the time that was normal in the days of MS-DOS?

(Forget it - it’ll never happen.)

We can download the latest security patches much faster.

Good point. And our computers will crash MUCH faster than before.

And it will take three weeks for a single image to download over dialup.

[sub]why yes, I am sick of waiting for affordable broadband, how did you guess?[/sub]

I smell a business opportunity, y’all!!!

Or, is that just tuna fish?

:smiley:

<GQ Hat on>

Actually, 1024-bit computers would be slower than the ones we have now. There aren’t a lot of 1024-bit calculations to do. Even stuff like graphics is better done on specialized processors, which will always beat general-purpose ones. A 1024-bit computer would have a lot of its processing power and data paths unused most of the time, taking up space (and a bigger chip is slower) and using power.

The reason 64 bits is better than 32 is that we were running out of 32 bit address space. Let me know when you need a memory (even a virtual one) bigger than 2 ** 64 bytes.
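To put numbers on that, here’s a quick back-of-the-envelope sketch in Python (my own illustration, not anything from the thread) of how the addressable space grows with pointer width:

```python
# Bytes reachable with an n-bit pointer.
def address_space_bytes(bits):
    return 2 ** bits

GIB = 2 ** 30  # one gibibyte
EIB = 2 ** 60  # one exbibyte

# A 32-bit machine tops out at 4 GiB of directly addressable memory,
# which is exactly the ceiling we were hitting...
print(address_space_bytes(32) // GIB)   # 4 (GiB)

# ...while 64 bits pushes the ceiling to 16 EiB.
print(address_space_bytes(64) // EIB)   # 16 (EiB)
```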

And of course download speeds have nothing to do with word lengths.
<GQ Hat Off>
Which reminds me of an article about how HDTV is a pain for porn, since you need to do a lot better job with makeup when you can see every blemish. The airbrush got invented for a reason, after all.

“I don’t see why anyone would ever need more than 640K RAM.” Bill Gates

“There is no reason for anyone to have a computer in their home” - Ken Olson, 1977, founder of DEC.

“I think there is a world market for maybe five computers,” :wink:

Still, a 64-bit address space lets you have up to about 4 billion GB of RAM. I may be just as bad as the earlier futurists, but I can’t imagine any need for more than that.

Your cite claims that the quote is apocryphal.

He wasn’t talking about PCs.

In all fairness to Voyager, the amount of memory a 32- or 64-bit system can address grows exponentially with word size, while something like the actual size of RAM only grows linearly: 1 MB is just 1 MB of RAM.

So, the jump to a 1024 bit architecture is a much larger one than it appears.

Yeah, like George Tenet didn’t mean that it was a slam dunk that Saddam had WMDs. He said it, exactly as I quoted. And, aside from the part about “controlling every aspect of our lives”, he was wrong about even what he DID claim to mean. I would LOVE a home where the computer switched the lights on and off, prepared the meals, ordered supplies that were running low, and two dozen other things I’m too lazy to think of (but if you asked my home-controlling computer, IT could tell you). :stuck_out_tongue:

I work in a supercomputing institute, and the computers are used to do heavy research into genomes, physics, chemistry, and other scientific fields. You would be amazed how many of the companies that make software for this research are still shipping 16-bit code! Only a handful have released 64-bit versions.

So in short, porn. It drives the technology much more than the sciences.

Strong AI (that is, attempting to build computers with human-class intelligence) might need that level of RAM: the human brain has on the order of a hundred billion neurons, so representing each neuron plus its connections to the others would get you into about that amount of RAM.

Of course, these sorts of problems usually aren’t solved by brute force, the brain is suspected to be highly redundant, and evolution rarely selects the most efficient solution for a given problem (especially a highly complex one). And then there’s the question of how to access all that RAM: today’s CPUs would take forever just to access each of those addresses once; doing real computation on them would be too slow to be useful. But CPUs are still getting faster and/or more parallel, too.

I don’t actually disagree with you: I can’t see the current serially-addressed, monolithic address space surviving much beyond the 64-bit era, but I wouldn’t be that surprised if I turned out to be wrong.
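The neuron arithmetic above can be sketched out, too. The figures below are rough assumptions on my part (neuron and synapse counts are hotly debated), but even generous numbers land in petabyte territory, which is enormous by today’s standards yet still comfortably inside a 64-bit address space:

```python
# Back-of-the-envelope RAM for a naive neuron-level brain model.
# All three figures are rough assumptions, not settled neuroscience.
NEURONS = 10 ** 11            # ~a hundred billion neurons
SYNAPSES_PER_NEURON = 10 ** 4 # ~ten thousand connections each
BYTES_PER_SYNAPSE = 8         # say, one 64-bit weight or pointer

total_bytes = NEURONS * SYNAPSES_PER_NEURON * BYTES_PER_SYNAPSE
print(total_bytes)            # 8000000000000000, i.e. ~8 petabytes
print(total_bytes < 2 ** 64)  # True: it still fits in 64-bit addressing
```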

It seems to me it’s more like 17 billion GB of RAM. Cause, you know, 4 billion GB wasn’t actually going to be enough for my purposes…
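For the record, the arithmetic behind that correction checks out; expressing 2[sup]64[/sup] bytes in GiB:

```python
GIB = 2 ** 30
gib_in_64_bit_space = 2 ** 64 // GIB
print(gib_in_64_bit_space)  # 17179869184, i.e. ~17 billion GiB
```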