So did Bill Gates and Paul Allen make the world's first computer emulator?

In Bill's eulogy to Paul Allen, seen here: https://www.msn.com/en-us/money/markets/bill-gates-what-i-loved-about-paul-allen/ar-BBOxZn2
he wrote this:
"We decided to start our next, more successful venture in December 1974. Paul and I were both living in the Boston area—he was working, and I was going to college. One day he came and got me, insisting that I rush over to a nearby newsstand with him. When we arrived, he showed me the cover of the January issue of Popular Electronics. It featured a new computer called the Altair 8800, which ran on a powerful new chip. Paul looked at me and said: “This is happening without us!” That moment marked the end of my college career and the beginning of our new company, Microsoft.

In those days, the chips were so limited that you couldn’t do what’s called “native development”—you couldn’t use a machine with that chip in it as you were developing the software for it. That made writing code for those chips pretty challenging. Paul had a great idea: to write some code that would let us emulate those chips on a more powerful computer, then port it over to the machine with the less powerful chip. That breakthrough was important for a lot of Microsoft’s early success, and Paul deserves credit for it."
Nowadays emulation is mainly used for old video/PC games and for making software run on platforms it's not exactly meant for, i.e. you can put Android on a PC with the right emulator.

But if they were the first to come up with the idea, I wonder what Bill would think of MAME etc. …

This is before my (professional) time, but I seem to recall that writing an assembler/compiler/interpreter for one machine was common on others, such as writing a BASIC compiler on a FORTRAN computer.

And as far as not being able to use a computer to write a compiler for that same computer, nonsense. I wrote compilers and debuggers for the 8080 on an IMSAI 8080. In fact, I had no choice, since I couldn’t afford anything else in 1976.

One debugger I wrote was actually an emulator. In order to run code in a debug mode, I had to simulate an 8080 on an 8080, then interrupt each instruction cycle for register analysis. Slow, but it worked.
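The approach described above can be sketched as a fetch-decode-execute loop with a hook that fires after every instruction. This is a minimal illustration in Python rather than 8080 assembler, and the class and opcode subset here (just MVI A, INR A, and HLT) are hypothetical simplifications, not anyone's actual debugger:

```python
# Toy model of "simulate an 8080, interrupting each instruction cycle
# for register analysis": a software CPU that executes one instruction
# per step() call, with a debugger hook invoked after every step.

class TinyCPU:
    def __init__(self, program):
        self.mem = list(program)   # program loaded at address 0
        self.pc = 0                # program counter
        self.a = 0                 # accumulator register
        self.halted = False

    def step(self):
        op = self.mem[self.pc]
        if op == 0x3E:                     # MVI A, imm: load immediate into A
            self.a = self.mem[self.pc + 1]
            self.pc += 2
        elif op == 0x3C:                   # INR A: increment A (8-bit wraparound)
            self.a = (self.a + 1) & 0xFF
            self.pc += 1
        elif op == 0x76:                   # HLT: stop execution
            self.halted = True
        else:
            raise ValueError(f"unknown opcode {op:#x}")

def run_with_debugger(program, hook):
    cpu = TinyCPU(program)
    while not cpu.halted:
        cpu.step()
        hook(cpu)      # "register analysis" after every instruction cycle
    return cpu

trace = []
cpu = run_with_debugger([0x3E, 0x41, 0x3C, 0x76],      # MVI A,0x41 ; INR A ; HLT
                        lambda c: trace.append((c.pc, c.a)))
print(cpu.a)   # prints 66 (0x42)
```

The hook is exactly why such a debugger is slow: every emulated instruction pays the cost of a procedure call and whatever analysis the hook performs.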

It was indeed very common. For instance Digital Equipment Corporation (DEC) used a PDP-10 cross-assembler (PAL10) to develop system software like the FOCAL interpreter for the PDP-8 minicomputer. And years before any of that, when I was a wee tot in first-year university, our first programming assignments were written for a simplified imaginary computer that ran as a simulator on the computing center’s mainframe.

Yes, Bill was somewhat oversimplifying the case. However, a big computer like the PDP-10 – which was also what Gates and Allen used to develop BASIC for the Altair 8800 – offered the tremendous advantage of disk storage, a fast printer, a whole suite of powerful utilities and a real OS to manage it all, which made the development job fantastically easier than it would have been on a raw Altair 8800. I’m not sure exactly what they wrote for the PDP-10, but they would have needed something like a cross-assembler as well as an 8080 simulator. IIRC Gates and Allen didn’t even have an Altair computer to use for development, even if they’d wanted to. When Gates flew down to MITS headquarters in Albuquerque with the paper tape of BASIC, it was the first time it had ever run on the native hardware.

Emulators definitely pre-dated Microsoft by quite a bit. The earliest emulators that I am aware of were for IBM mainframes in the 1960s. Newer IBM mainframes often contained emulators for older mainframe models so that customers could run software on the newer mainframes that had been written for those older mainframes.

According to Wikipedia, IBM coined the word “emulator” in this sense in 1963, but they used it specifically to refer to using microcode to allow one computer to run another’s machine code. Apparently they had used software emulation even earlier.

While I realize the word “emulator” has a long and storied history in computing, I’ve always believed that it was usually misapplied by being used in contexts where “simulator” was more appropriate. To “emulate” means to “meet or exceed” the behavior of some other entity, so to me it carries the connotation of “better than”, or at a minimum, “at least as good as”. But historically this has not been true, since a simulation will usually be slower than the native hardware unless it’s running on dramatically faster hardware than the original, or potentially when using a microcoded approach on new hardware. Running faster than the legacy machine is common today, when simulators of legacy hardware run on modern computers, but it was not typically the case historically.

It appears that the term was coined, as per the Wiki link above, by IBM to distinguish a new method of simulating older hardware in microcode from ordinary software simulation, a method that had the potential to indeed be superior to both simulation and even to the original hardware. But as frequently happens with language, the term was widely co-opted to mean any kind of simulation, and that meaning became entrenched. Thus leading to oxymorons like the one quoted in the Wiki article: “Yes, it’s possible for a Commodore 64 to emulate an IBM PC, in the same sense that it’s possible to bail out Lake Michigan with a teaspoon”.

Porting software across machines this way was well known by the 1960s. I think there was even some stuff like this done on Burroughs machines in the 50s.

Let’s get back to the 40s: People in the Magic group were able to deduce the operation of the Japanese Purple encryption machine and built their own replica to emulate it. (All from the code breaking without seeing the actual machine.)

If you want to get really basic, you can get back to 1936 and Turing’s famous paper On Computable Numbers, with an Application to the Entscheidungsproblem and related papers. (I have a copy of this in my basement, btw.) He convincingly argued that all computations can be emulated on one of his machines.

So the idea is pretty well ingrained into Computing from nearly the beginning.

They did not, it was an established technique.

Gary Kildall gives a very nice account of those early days of programming the 4-bit 4004 and 8-bit 8008 microprocessors. He wrote a simulator for the 4004 on an IBM System/370 mainframe. Getting a program into one of these was hard work: lots of paper tape. He developed BIOS programs and a monitor, and the first software to handle storage devices, which became the disk operating system CP/M. Gates and Allen made a name for themselves writing BASIC interpreters for microprocessor boards.

He tells of how these two rascals, as school students, hacked into his computer with a modem. :dubious: It is a very interesting memoir.

Excerpt about Kildall’s philosophy toward software:
Gary viewed computers as learning tools rather than profit engines. His career choices reflect a different definition of success, where innovation means sharing ideas, letting passion drive your work and making source code available for others to build upon. His work ethic during the 1970s resembles that of the open-source community today.

Meanwhile Bill Gates, when he found out that the Homebrew Computer Club was distributing free copies of his BASIC for the Altair, wrote a strongly worded open letter to hobbyists condemning those who gave away his property for free, blasting the idea of “free software”, and claiming that he and Allen had used $40,000 worth of computer time to develop Altair BASIC.

There’s a rich irony both in that claim and in the infringement complaint, because most of the computer time used for the development was on the PDP-10 run by Harvard and paid for by the Defense Advanced Research Projects Agency for use in DARPA research projects. In effect, Gates stole computer time from the US military, and in fact got into some amount of trouble over it. Only later, when Harvard administrators quite rightfully kicked him off the PDP-10, was Gates actually forced to pay for commercial timesharing, but by then his software was well developed and established.

There’s a similar story regarding Windows itself. When Steve Jobs accused Gates of stealing the windows paradigm from Apple, Gates pointed out that Jobs had in turn stolen it from Xerox PARC, which had invented the bit-mapped windows GUI idea, the mouse, the object-oriented approach, the Ethernet LAN, and pretty much all the basics that define a modern PC. Indeed Bill Gates found no irony in making the analogy that he was like the guy breaking into a house to steal the TV, only to find that someone else had already stolen it.

But IIRC the basic BASIC by Bill and Paul was pretty much written by hand in native assembler. The trouble with compiling a compiler etc. (or in the case of BASIC, an interpreter) is that the generated code leans on generic subroutines, while humans in those days could better optimize code when the goal was to fit something into 4K or 8K of RAM.

The key was the decisions that went into specific processes. For example, when the instruction said “GOTO 1020”, should you search forward from the current position, or always search for line 1020 starting from the beginning of the program? The obvious efficiency is to check what line you are on, but does that check routine consume precious bytes of RAM? Add complexity? Ah, those were the days.
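The trade-off above can be sketched in a few lines. This is a hypothetical illustration (the function names and program representation are mine, not any actual BASIC interpreter's): the program is an ordered list of (line number, statement) pairs, and a GOTO can either always scan from the top or spend a few extra bytes of interpreter code to scan forward from the current line when the target lies ahead:

```python
# A tiny BASIC program as ordered (line_number, statement) pairs.
program = [(10, 'A=1'), (1000, 'A=A+1'), (1020, 'PRINT A'), (2000, 'END')]

def goto_from_start(target):
    """Simplest GOTO: always scan from the first line."""
    for i, (num, _) in enumerate(program):
        if num == target:
            return i
    raise RuntimeError("UNDEF'D STATEMENT ERROR")

def goto_smart(current_index, target):
    """Costs extra interpreter code: scan forward from the current line
    when the target lies ahead, otherwise fall back to a full scan."""
    start = current_index if program[current_index][0] < target else 0
    for i in range(start, len(program)):
        if program[i][0] == target:
            return i
    raise RuntimeError("UNDEF'D STATEMENT ERROR")

# From line 1000 (index 1), GOTO 1020 examines two entries with the
# smart search instead of three from the top of the program.
print(goto_smart(1, 1020))    # prints 2
print(goto_from_start(1020))  # prints 2
```

The speed difference is trivial here, but in a long program with a GOTO inside a hot loop, the linear scan from the top was a real cost, which is exactly the bytes-versus-speed decision the post describes.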

(The earlier version of the Commodore PET would every so often go catatonic as it did “garbage collection”: when the linked list of strings was full, it went through the list to see which ones were no longer used…)
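Roughly, what such a string garbage collector does is walk every string variable, keep only the strings still referenced, and compact the survivors into a fresh heap. This sketch is purely illustrative (the names and data structures are mine, and the real PET routine was far slower because it lacked back-pointers from strings to variables):

```python
def collect_strings(variables, string_heap):
    """variables: dict of name -> index into string_heap (or None).
    Returns updated variable bindings plus a compacted heap containing
    only the strings that are still referenced by some variable."""
    new_heap = []
    new_vars = {}
    for name, idx in variables.items():
        if idx is None:
            new_vars[name] = None          # variable holds no string
            continue
        new_vars[name] = len(new_heap)     # string survives; record new index
        new_heap.append(string_heap[idx])
    return new_vars, new_heap

heap = ["HELLO", "WORLD", "STALE", "PET"]       # "WORLD"/"STALE" unreferenced
vars_ = {"A$": 0, "B$": 3, "C$": None}
vars_, heap = collect_strings(vars_, heap)
print(heap)   # prints ['HELLO', 'PET']
```

While a pass like this runs, the interpreter can do nothing else, hence the "catatonic" pauses.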

My dad related doing programming on some old IBM(?) device that used a drum and dozens of read/write heads for main storage in the 1950s or early 60s. The programming process was to use a giant spreadsheet (sheet, as in paper) where you filled in the program as machine instructions and assigned them to cells on the drum, on paper. Each instruction consumed a number of clock cycles, and also contained the cell address of the next instruction. Half the fun of programming was to determine how far the drum would rotate while processing that instruction, and locate the next instruction as close as possible to that far along the drum. Otherwise, you could wait an entire rotation of the drum and the program would run very slowly.

He was likely referring to the IBM 650. It used drum memory and required that kind of optimization.* IBM provided paper “memory charts” to facilitate it, but pretty soon that work was done automatically by the 650’s assembly program. The 650 dates from the 50s, announced in July, 1953, and the first one was installed in December, 1954. Its assembler was called SOAP, for Symbolic Optimal Assembly Program, and the “optimal” part referred to the fact that it would locate each instruction at a drum location that corresponded to the execution time of the previous one, so the drum’s rotation would bring up the next instruction at just the right time. However, there was no way to coordinate that with fetching data from tables and the like. In fact the SOAP assembler had a pseudo-instruction – Block Reservation (BLR) – to allocate drum memory blocks for such use to prevent the optimization process from using them. The drum had 20, 40, or 80 bands (tracks) around it depending on the chosen capacity, and I’m pretty sure each band had its own fixed head. The drum was remarkably small (picture here) and spun at 12,500 RPM.
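The core of that SOAP-style optimization can be sketched as simple modular arithmetic: given the drum address of the current instruction and its execution time measured in drum word-times, place the next instruction at the address that rotates under the head just as the CPU is ready for it, skipping any addresses reserved for data (as with the BLR pseudo-instruction). The function below is my illustration, not actual SOAP logic, and the timing numbers in the example are made up:

```python
WORDS_PER_BAND = 50   # the IBM 650 drum held 50 words per band

def optimal_next_address(current_addr, exec_word_times, reserved=()):
    """Pick the first free drum address at or after the ideal slot,
    where 'ideal' means the word that arrives under the read head
    exactly when the current instruction finishes executing."""
    ideal = (current_addr + exec_word_times) % WORDS_PER_BAND
    for offset in range(WORDS_PER_BAND):
        candidate = (ideal + offset) % WORDS_PER_BAND
        if candidate not in reserved:
            return candidate
    raise RuntimeError("band full")

# An instruction at word 10 taking 4 word-times is best followed by
# one at word 14; if word 14 is reserved for data, settle for word 15
# (one extra word-time of rotational latency, not a full revolution).
print(optimal_next_address(10, 4))                 # prints 14
print(optimal_next_address(10, 4, reserved={14}))  # prints 15
```

Missing the ideal slot by even one word only costs one word-time; placing the next instruction just *before* the ideal slot costs nearly a full drum revolution, which is why the paper memory charts (and later the assembler) mattered so much.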

You could also get optional core memory for the 650 – a whopping 60 words! Not 60K. 60. A remarkable fact considering that today programmers waste gigabytes just to save themselves ten minutes of programming effort.

  • Even I am not old enough to have ever worked with an IBM 650! This is info collected in my capacity as a computer history geek. :slight_smile: