I’m pretty certain I got that from The Calligraphic Button Catalog, since I’ve known it for a good three decades or more and I only learned about xkcd five years or so ago.
Doesn’t seem to be listed anymore, though, so maybe it’s something I heard on Usenet or on IRCII.
That may well be true, I didn’t have any experience with other Basic variants at the time. What I did have ca. 1985 was a Basic extension module called Simon’s Basic which offered additional structures like WHILE, DO…WHILE, REPEAT…UNTIL loops and real subroutines and functions. It also had much more convenient support for graphics and sound functions. I used to transfer the Pascal programs I wrote at school into Simon’s Basic on my home computer. Still have my Simon’s Basic programming manual somewhere.
ETA: one of those projects at school was to write a two-player blackjack game. I transferred it into Simon’s Basic and for fun added a mode where you could play against the computer. My real-life strategies for playing blackjack weren’t very sophisticated, so the intelligence of the computer player wasn’t very remarkable. It didn’t count cards for instance.
The original form is a classic aphorism of computer technology. It’s actually quoted in that “What If?” column as a starting point. (The quote itself apparently goes back to Andrew Tanenbaum: Andrew S. Tanenbaum - Wikiquote )
The point being, I guess, that even “sneakernet” has grown immensely in capacity.
If I had Doc Brown’s time machine, I wonder if I could take a modern android phone back to 1975, go to one of those organizations who were about to purchase a Cray-1, and tell them “Pay me half of what that Cray costs; I can do better!”
With a relatively modest amount of money and equipment I can buy off the shelf in 2022, is it practical to hide the phone inside an impressive-looking case, and actually use it for the typical tasks a supercomputer of that era was used for?
Well, you would have to write phone versions of the software that ran on the supercomputers of the day. I doubt there are iPhone or Android apps for weather modeling on a grand scale or anything similar. The next problem would be how to get the data onto the phone: there’s no common interface.
It would be like trying to wow the locals in 1960 with a GPS. It would have a pretty display, but all you’re gonna get is “Finding satellites…” blinking in the middle of it.
My drive never failed, but I had plenty of issues with individual disks. I wasn’t using them for archives/backups, so it wasn’t a huge problem; I’d just go get a new disk when one started having problems and start using it to jockey files to and from work.
Reading the question, I think s/he meant doing this in 2022. My home wifi is 20 to 300 times faster than any network connection available in the ’70s; I wouldn’t think connectivity an issue. So yes, if you could find or make software to run on the thing, you could do ’70s-era supercomputing tasks on that phone.
That said, the killer feature of a Cray-1 was that it was a vector processor: you could, in hardware, point at an array of data with one instruction and use another to perform an operation on everything in the array. An iPhone’s CPU doesn’t do that, though arguably the GPU does.
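The vector idea is easy to sketch in NumPy (purely an illustration, not era-accurate code): one expression applies the same multiply-add to every element of an array, the way a Cray-1 vector instruction swept through a whole vector register.

```python
import numpy as np

# Scalar style: one multiply-add per loop iteration, one element at a time.
def saxpy_scalar(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

# Vector style: one expression over the whole array, analogous to a
# vector instruction operating on everything in a vector register.
def saxpy_vector(a, x, y):
    return a * np.asarray(x) + np.asarray(y)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
print(saxpy_scalar(2.0, x, y))   # [12.0, 24.0, 36.0]
print(saxpy_vector(2.0, x, y))   # [12. 24. 36.]
```

Same arithmetic either way; the difference is whether the hardware (or library) gets to see the whole array as one operation.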
Using the phone I’m posting on right now, I could port the weather forecasting software of the time to run on my phone. Natively. No external support needed. And it would run effortlessly, since it would be ridiculously simple and small compared to the apps already on the phone.
The real issue would be getting the weather observation data needed to do a forecast into the phone, since communication interfaces would be almost entirely incompatible with what a smartphone would support.
Improved processing speed is nothing new. In 1975, in grad school, I wrote an emulator for the LGP-21 mentioned above (vintage 1962) in Lockheed Sue microcode, which I executed by simulating the Sue on a PDP-11 with a simulator written in Pascal.
It ran faster than the original machine did.
Apple’s current iPhone CPUs implement ARM’s NEON SIMD (Single Instruction Multiple Data) instructions, with 128-bit registers — up to 16 elements at a time for byte-sized data — which amounts to a smallish vector processing capability.
Qualcomm’s Snapdragon processors (used in most Android devices after 2017) likewise have 128-bit NEON SIMD units.
So Cray’s ace in the hole can be reproduced in modern smartphone hardware.
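To pin down the numbers above: NEON registers are 128 bits wide on both chips, and the element count per instruction depends on how wide each element is. A quick sketch of the arithmetic:

```python
# NEON SIMD registers are 128 bits wide; the number of "lanes"
# (elements processed per instruction) depends on element width.
REGISTER_BITS = 128

for elem_bits, name in [(8, "int8"), (16, "int16"),
                        (32, "float32"), (64, "float64")]:
    lanes = REGISTER_BITS // elem_bits
    print(f"{name}: {lanes} lanes per instruction")
```

So “16 elements” is the byte-sized case; for the single-precision floats a Cray-style workload cares about, it’s 4 per instruction.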
Vector instructions aren’t an advantage in and of themselves. For a given level of performance, they’re unambiguously worse than scalar instructions, since it’s not always possible to vectorize, and because they’re harder to program (well, they might save a tiny amount of power).
The Cray-1 wasn’t even really a SIMD computer; it used the vector instructions to achieve what is more commonly achieved with pipelines and caches. It only did a single multiply-add per clock cycle, unlike modern CPUs, which might legitimately do 8 or 16 (or a GPU, which does thousands).
GPUs are similar in that they keep many thousands of threads in flight at a time. To hide memory latency, they issue a read on one thread, then immediately switch to another, issue a read there, and so on. By the time the scheduler circles back to the first thread, the memory read (which might take hundreds of clock cycles) has finished.
The iPhone CPU is pipelined and has a large cache. Even if you ignore the SIMD instructions and the multiple cores, it gets multiple gigaflops of double-precision math just by virtue of its >3 GHz clock rate and the fact that it can issue a math instruction at least once per cycle. No special vectorization effort needed.
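The back-of-envelope arithmetic behind that claim, using the Cray-1’s commonly cited peak of 160 MFLOPS (80 MHz, chained multiply and add pipelines) for comparison:

```python
# Scalar-only estimate for a modern phone core vs. the Cray-1's
# advertised peak.  Figures from the post above plus the commonly
# cited Cray-1 peak of 160 MFLOPS.
phone_clock_hz = 3.0e9              # ">3 GHz" clock rate
phone_flops = phone_clock_hz * 1    # at least one math op issued per cycle
cray1_flops = 160e6                 # 80 MHz x 2 flops/cycle (chained mul+add)

print(f"phone, scalar only: {phone_flops / 1e9:.1f} GFLOPS")
print(f"Cray-1 peak:        {cray1_flops / 1e6:.0f} MFLOPS")
print(f"ratio: ~{phone_flops / cray1_flops:.0f}x")
```

Roughly a factor of twenty before you even touch SIMD, extra cores, or the GPU.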
They’re hanging on my Mom’s wall, so someday I’ll get some of them. I hope the one with white beads isn’t ivory.
I was taught how to use it for arithmetic when I was a child, but anything I learned is gone. He also taught me chisenbop, which I can still use to count, but I don’t know the tricks for arithmetic. The only thing I clearly remember is my grandfather saying that the dual top beads of the soroban made it much more versatile than an abacus.
I’m sure I’ll make just as much of an impression on my future grandchildren when I show them how to use Calc in Emacs. “All I remember is grandpa going on about RPN being superior.”
Back when his son was thirteen, a friend of mine said, “He’s gotten interested in slide rules. I couldn’t find any, but then I thought of you.” I wasn’t sure whether to be flattered or not, but I gave him my Pickett and its instruction book.