It pretty much saved the company. Up to that point, many people thought Apple would go bust. The iMac opened up new markets for Apple in the home and education fields. And in looks, it was completely original. As an example, in design agencies, designers used Macs and ‘the suits’ used PCs. With the iMac, suddenly everyone in the company was on a Mac.
It wasn’t even the best Apple had at that time. The Apple IIGS could do 320×200 in 16 colors or 640×200 with 16 dithered colors. It could actually produce 4,096 different colors, although not all at the same time. The first Mac did 512×342 in black & white.
The example I cited gave only a very rough idea of the speed of a BASIC program running on the Mac introduced in January 1984, the very earliest Mac sold to the public.
That 1984 Mac was an excellent machine in many ways, mostly for graphical applications, but not for running mathematical computations. It was never designed to do that.
The early Mac had several processors, but most of them were there to support graphics, not computations.
For anyone who would compare the speed of the BASIC language running on a Mac with other BASIC implementations, you must keep one thing in mind.
An interpreter runs perhaps 100 times slower than a compiled language for certain tasks. The BASIC language can be implemented as a compiled language (although it’s hardly ever implemented that way) or as an interpreter.
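To make that gap concrete, here’s the sort of trivial loop you’d time. This is purely my own sketch in period-style line-numbered BASIC, not from any real test suite. An interpreter has to re-decode the loop’s statements on every one of the 10,000 passes; a compiler translates them to machine code once, before the loop ever runs.

10 REM Toy arithmetic loop for timing an interpreter vs. a compiler
20 A = 0
30 FOR I = 1 TO 10000
40 A = A + I * 2 / 3
50 NEXT I
60 PRINT A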
It is very unfair to compare the speed of two BASIC programs unless they are both implemented in the same way and run on the same kind of CPU.
Very unfair to compare a machine running on an 8088, 8086, or 80286 with another machine running on a different kind of CPU.
Very unfair to compare a machine running an interpreted BASIC with one running a compiled BASIC.
Very unfair to compare a BASIC program running on one kind of CPU with another BASIC program running on a different kind of CPU.
Also, very unfair to compare a BASIC program with a Fortran or Pascal program. They are almost impossible to compare to one another.
I stated my example just to show that I couldn’t buy a Mac because I needed a PC that would run mathematical computations, specifically statistical analysis.
Comparing different CPUs is unfair? Fair doesn’t enter into it. The whole point of benchmarks is to see which CPU is faster for a given task. That’s like saying a race is unfair unless both runners use the same body.
But it kind of has to in order to run graphical applications well. If it’s bad at math, it sure as hell ain’t gonna be good at graphics. Those two concepts are intimately linked (so much so that these days a lot of heavy math gets pushed over to the GPU).
I mean, the BASIC example just doesn’t make any sense to me. It has to have been a badly broken version of BASIC or something, because I can’t see any way that it takes more than one second per character to display. (I’m assuming your program is something like 10 FOR T = 1 TO 10: PRINT T: NEXT.)
The Mac sure as hell ain’t counting that slow. Counting is one of the most basic functions of a computer, and if it can’t do a simple increment or decrement quickly, it sure as shit isn’t going to handle a GUI. I know the increment is not going to be handled the same as a simple machine-level increment instruction, but even with the layer of interpretation and the extra steps of allocating memory for the variable, storing its address, and all the other stuff that has to happen under the hood of a BASIC command, that still seems way, way, way longer than necessary.
And the benchmarks I posted seem to say that MacBasic was, indeed, a good bit faster than MS BASIC at the time.
The only thing I could see is if the BASIC routines were somehow horribly inefficient. Maybe the BASIC you were using had a slow screen-display implementation due to some trickiness in the Mac’s display routines. But even that seems odd to me. Would you have the same sort of problem with just a simple PRINT statement? I doubt it.
I mean, doesn’t this seem a bit bizarre to any of you other computer types out there?
^Sorry, just under a second per increment. And it looks like you might not even be calling a print instruction after every increment. In that case, I simply can’t believe a properly programmed BASIC program would take 15 seconds to count to 20. Something has to be wrong, missing, or misremembered in this story.
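For reference, here’s the whole program as I’m imagining it; this is my assumption, not the actual code from the story:

10 REM Count to 20, printing each value
20 FOR T = 1 TO 20
30 PRINT T
40 NEXT T

Twenty increments and twenty PRINTs. Even a slow interpreter on the Mac’s 8 MHz 68000 should blow through that in a blink, and without the PRINT it would be faster still. Fifteen seconds just isn’t plausible.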
My career working on IBM mainframes started in the mid-’80s and I’ve been working on them ever since. At places like NASA, the Dept. of Justice, NIH, the Pentagon… I don’t think I would ever have gotten this far working on Macs.
Interesting. As a current Mac lover, I don’t remember it being that big a deal. I just remember it being kind of a side note; nobody who wasn’t into Macs at the time paid it any heed. It wasn’t until the iPod and OS X era that I started to see a mainstream following.
It was a big deal only to die-hard Mac fans, which was less than 3% of the market at the time. It was pretty much a joke to the computer world at large. And while there are a lot of revisionist articles that claim it was the iMac that saved Apple, news stories from the time tell a different story, like this from NPR or this from the Seattle Post, both from before the iMac was announced.
Note that the self-interest had to do with anti-trust suits. Microsoft needed a viable competitor to defend against claims that it had a monopoly on computer OSes.
I have a few old BYTE magazines saved, and the review of the first Mac is in one, along with an interview with three of its designers.
This is from the time when your computer could easily cost more than your car.
The list price is $2495, or $2990 with the ImageWriter dot-matrix printer. For another $495 you can either get a second 400K-byte floppy drive or a 300/1200 bps modem, should you want to get married on CompuServe.
And you’ll need some software. MacPaint and MacWrite are $195, but that’s for BOTH!
By comparison, in the same magazine, the Heath Hero Jr. was only $1000. So you have to figure: do you want one Mac or three robots?
The designers were proud it had only one mouse button and no capability for expansion, so no one’s Mac would be sullied by third-party hardware or sufficient memory. How very Apple of them.
In the benchmarks it got an “*” on their sieve test because it didn’t have enough memory. The Apple II and PC ran it. Another benchmark, doing 10,000 single-precision multiplications and divisions, took a rather average 78.9 seconds.
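For anyone curious, that sieve is the old BYTE benchmark, the Sieve of Eratosthenes over 8190 flags. A BASIC version along the usual lines (my reconstruction from memory, not the magazine’s exact listing) looks like this:

10 REM BYTE-style Sieve of Eratosthenes over 8190 flags
20 SIZE = 8190
30 DIM FLAGS(8190)
40 COUNT = 0
50 FOR I = 0 TO SIZE: FLAGS(I) = 1: NEXT I
60 FOR I = 0 TO SIZE
70 IF FLAGS(I) = 0 THEN 130
80 PRIME = I + I + 3
90 K = I + PRIME
100 IF K > SIZE THEN 120
110 FLAGS(K) = 0: K = K + PRIME: GOTO 100
120 COUNT = COUNT + 1
130 NEXT I
140 PRINT COUNT; " PRIMES"

The FLAGS array alone is 8,191 numeric elements, and at the several bytes per element these interpreters typically used, that’s tens of kilobytes before the program, the interpreter, and the OS get a byte, which would plausibly explain the asterisk on a 128K machine.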
It was the early ’90s and I was preparing for graduate school. I decided to replace my word processor with a computer and began looking around and testing them out in various stores.
I worked with someone who was a Mac fanboy and he guided me towards Apple. After spending quite a bit of time looking, I found I liked the Mac and decided to go that route.
After grad school my computer finally died, and because where I worked was all PC, and because Apple products were too expensive, I switched.
When my final PC died about 4 years ago I went with the MacBook Pro and now have an iPad.
While I am not a fanboy Apple user, I do prefer their products. I have had no issues with them other than the expense, and if that becomes important in the future, I could easily switch again.