How did the front switches operate on early Altair and similar computers?

Yes, I eventually bought an RS-232 terminal to control the computer and do programming, and installed CP/M for general computing, but by then we were too far along with our own OS to convert the music project over. However, early CP/M required writing a small piece of custom code (essentially a driver) to insert into the CP/M kernel, since the kernel shipped without any hardware interface module. And since CP/M was in assembly, we had to write the interface module in assembly or object code as well.

And we had to write an RS-232 driver for the terminal. Nothing was plug-n-play!

It has enough of a place in history that I can’t imagine you couldn’t find it a home. If you weren’t almost certainly on the other side of the planet, I would cheerfully give it one. There will be others, just as geeky as me, who will appreciate its place in history and who are closer to you.

All this talk about lights and switches and computer history reminds me that I started a thread some time ago about an enterprising geek who makes working replica consoles of the DEC PDP-8 and PDP-11/70 minicomputers:

Someone may want it. I’ve cleared out the bulk of my collection of old computer hardware. I have an interesting collection of chips and a couple of boards of my own design, and that’s it. There are plenty of collectors and museums with examples of all those beasts, so it takes someone who doesn’t mind taking up the space yet still loves the technology of yesteryear.

Dude, I would totally be up for a piece of computer history, but an original Altair 8800 has completed eBay listings at over $2.5K. You might want to look into that.

https://www.ebay.com/itm/154972973891?hash=item24151beb43:g:a6gAAOSwR~9ibbYr

Definitely do that! ^^^ Whoever buys that is going to take care of it. I’m gonna hate finding out I gave away something valuable, but I won’t be surprised.

At the very least, if there’s a big university nearby that has some sort of “museum” in the computer department, see if they want it.

Yes, I entered a simple program exactly once back in the day - 1977, I think, on a PDP-6.

The steps were all the same:
You have a sequence of bytes in machine language that make up a program.
Set the address, set the byte value in binary, using the switches. Toggle “STORE”.
Do the same for the next address, and the next, and the next…
When done, set the address switches to the entry point of the program and toggle “EXECUTE”.
The program will of course fail; almost nothing runs right the first time.
Use the “EXAMINE” feature to look at memory locations to see what happened.
You can “STEP” one instruction at a time and “EXAMINE” after each step to see where it went wrong as it executes.
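The toggle-in workflow above can be sketched in a few lines of Python. This is a toy model of the front-panel operations only (memory, an address latch, and EXAMINE / DEPOSIT / DEPOSIT NEXT), not an 8080 emulator, and the byte values deposited are arbitrary, not a real program.

```python
# Toy model of an Altair-style front panel. EXAMINE latches an
# address and shows that byte on the lights; DEPOSIT stores the
# data switches at the current address; DEPOSIT NEXT advances the
# address first, so consecutive bytes need only one toggle each.

class FrontPanel:
    def __init__(self, size=256):
        self.mem = [0] * size   # 256 bytes, like a stock Altair
        self.addr = 0           # address selected by the switches

    def examine(self, addr):
        """Set the address switches and toggle EXAMINE."""
        self.addr = addr
        return self.mem[self.addr]   # the data lights show this byte

    def deposit(self, value):
        """Set the data switches and toggle DEPOSIT."""
        self.mem[self.addr] = value & 0xFF

    def deposit_next(self, value):
        """Toggle DEPOSIT NEXT: advance the address, then store."""
        self.addr += 1
        self.deposit(value)

panel = FrontPanel()
panel.examine(0o000)        # point at address 0 (octal, panel-style)
panel.deposit(0x3E)         # first byte
for b in (0x2A, 0x3C, 0x76):
    panel.deposit_next(b)   # remaining bytes, one toggle each

print([hex(b) for b in panel.mem[:4]])  # ['0x3e', '0x2a', '0x3c', '0x76']
```

The verify-by-reading-back step is just `panel.examine(addr)` again, which is exactly how you’d hunt for a mis-toggled byte on the real panel.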

I just did a test count loop.

To boot, you probably want a very simple program which says:
Poke the value into the I/O buffer that asks for a byte (e.g. from a paper tape reader).
Read the byte and move it to memory at address X.
Increment X; repeat until you’ve done Y bytes.
If you’re brave, then tell it to execute that and away you go…

When you’re sure, burn that program into a ROM and make it permanent memory.

Every loader device might be different, so there were no standard I/O routines that every computer could ship with.
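The bootstrap loop described above is simple enough to sketch in Python. Here a plain list stands in for the paper-tape reader, and the start address, byte count, and tape contents are all made up for illustration; the real thing would be a dozen or so 8080 instructions toggled in by hand.

```python
# Sketch of the classic bootstrap idea: pull bytes from an input
# device one at a time, store them at successive addresses, then
# hand control to the loaded code.

def bootstrap(mem, tape, start, count):
    """Copy `count` bytes from the tape into mem starting at `start`."""
    x = start
    for _ in range(count):
        mem[x] = next(tape)   # ask the device for a byte, read it
        x += 1                # increment X, repeat...
    return start              # entry point to jump to when done

memory = [0] * 256
paper_tape = iter([0xC3, 0x00, 0x01, 0x00])  # 4 arbitrary bytes
entry = bootstrap(memory, paper_tape, start=0x10, count=4)
print(memory[0x10:0x14], "entry at", hex(entry))
```

Burning that loop into ROM, as the post says, is what turned the whole toggle-it-in ritual into a one-button boot.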

I have a reprint of an article someone in the department ordered back in the mid-70’s from Radio-Electronics for their cover feature, an 8008-processor home computer. It had a secondary project, a board with 256 bytes of RAM. A lot of early computing was about cheap and efficient ways to store a lot of data, for much earlier meanings of “a lot”. One fellow I worked with knew IBM 360 assembler - he said he had to learn it to optimize his programs, because the computer he first worked with did not have enough free RAM (40K?) to load the COBOL programs he’d written and compiled.

The story about Bill Gates is that he read a manual, used a teletype machine to hand-code a minimal BASIC and punch it onto paper tape - since he didn’t have the actual computer - took it to a computer show, and it ran the first time!

Not Bill Gates, but his partner Paul Allen. And it wasn’t BASIC, but the bootstrap loader for Altair BASIC. Gates did eventually manage to write a shorter bootstrap program.

While on final approach into the Albuquerque airport, Allen realized that they had forgotten to write a bootstrap program to read the tape into memory. Writing in 8080 machine language, Allen finished the program before the plane landed. Only when they loaded the program onto an Altair and saw a prompt asking for the system’s memory size did Gates and Allen know that their interpreter worked on the Altair hardware. Later, they made a bet on who could write the shortest bootstrap program, and Gates won.

Small nitpick. The PDP-6 did not have “bytes”, and neither did most other machines of the original PDP-6 era (the PDP-6 goes back to 1964). The word length was 36 bits, not divisible by 8, and in any case the smallest addressable unit of storage was the word – the entire 36 bits – and this was the amount of data entered for each address. Otherwise your description is pretty much spot-on.

The concept of 8-bit bytes and byte addressability (and with it, the popularity of hex rather than octal notation) was first widely introduced by IBM with the 32-bit System/360 family in the latter half of the 60s, and by DEC with the 16-bit PDP-11 family in 1970.

Someone above mentioned octal as a human-readable form of binary. Word lengths divisible by three bits are the major reason octal was so common back then: an octal digit is exactly three bits, so a 12-, 18-, or 36-bit word encodes as a clean whole number of full-range octal digits - a 36-bit word is exactly 12 octal digits, from 000000000000 to 777777777777.
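The arithmetic is easy to check in Python, which has octal formatting built in. The contrast with a 16-bit word (not divisible by 3) shows why octal on the PDP-11 always has a stunted leading digit:

```python
# Each octal digit is exactly 3 bits, so a 36-bit word is exactly
# 12 octal digits with every digit able to run 0-7.
word_bits = 36
max_word = (1 << word_bits) - 1          # all 36 bits set

print(format(0, "012o"))         # 000000000000
print(format(max_word, "012o"))  # 777777777777

# 16 bits doesn't divide by 3: 16 = 5*3 + 1, so the top octal
# digit of a 16-bit PDP-11 word can only ever be 0 or 1.
print(format(0xFFFF, "06o"))     # 177777
```

That leftover bit is why the 16-bit maximum prints as 177777 rather than a row of sevens; DEC stuck with octal on the PDP-11 anyway, as the post below describes.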

I started out in assembler language programming on a Sperry Univac 1180 mainframe, which was also a 36-bit word machine. It took me a long time to get used to hexadecimal after moving on to more modern systems.

If I had an Altair, I would definitely talk to the folks on the ClassicCmp mailing lists.

I never did. I started with the 12-bit PDP-8 and PDP-12, then the 36-bit PDP-10, then the 16-bit PDP-11, where hex would have been a natural fit. But DEC had such a long cultural history of always using octal that the tradition was maintained even with the PDP-11. It was IBM, moving from 36-bit architectures like the 70xx series to the System/360, that went all gung-ho with hex notation.

Note this picture of a PDP-11/70, for instance, and how the switches are arranged in sets of three, for ease of entering instructions and addresses specified in octal.

Like all the other members of the PDP-11 family, the 11/70 was a 16-bit word byte-addressable machine with a 16-bit virtual address. The extra switches beyond 15 were used to specify absolute hardware addresses in physical memory normally mapped by the memory management hardware.

By contrast, here’s a System/360 Model 65, showing the switches arranged in sets of 8, corresponding to two hex digits each and labeled as byte sequences.

The Altair was a little before my time, and when I’d see mention of it now and then - back, oh, I don’t know, in the days of magazines or paper books or the early web - it was always presented as if you just had the box with lights and switches and 256 bytes, and for some reason nary a word about how you could build or buy peripherals. So I’d always try to imagine how in the world an expensive box of lights and switches was enough fun for anyone.

We had PDP-11/20s in grad school. The one we used to teach assembler did not have a ROM for the boot program; the one we used for research did. This was 1973. I entered the boot code way too many times; by the end of the term a bunch of students had learned it and we TAs were saved.

Your Altair is quite valuable - they sell for thousands of dollars. Please don’t dump it.

I’ve watched several of the Altair videos. It’s interesting that the switches are similar to the features of a debugger.

My first debugger was on the VAX, running DCL. It changed my professional life. I had spent several years adding code just for debugging. It was always important to comment out and disable that code before it went into production.

I loved the VAX debugger. Stepping line by line, examining the contents of variables, depositing test values. It cut my debug time in half.

The Altair and other early computers were harder because it was all machine code. I would have benefited from learning that before using FORTRAN and PL/I.

I’ve got an Arduino-based emulator/replica of an Altair 8800. I’ve had huge fun with it - it almost tempted me to learn 8080 machine code just so I could toggle in a program completely from scratch.

I feel the same way, except it’s not 1979, when resources were more limited. Back then there was a lot of information in computer magazines, including code that could be studied and run. Hobbyists shared information and fueled interest in early computers.

Colleges offered math and science students computer courses and labs. I remember students having to wait in line to get their punched cards read, compiled, and executed.

An emulator would be a fun way to explore an Altair 8800.

All the aforementioned systems (including IBM mainframes) have emulators.
E.g.
https://www.s2js.com/altair/sim.html

Here is an Analytical Engine simulator, for that Old School experience:

Holy crap, and I thought writing games on a TI-99/4A was obnoxious… Of course, after that I decided to make my own programs for my VIC-20 and later a C64, which was even worse…

Is it true that Bill Gates wrote a music program that’s considered one of the first computer “programs” as we know them? In fact, supposedly before Microsoft, he was famous among computer people as one of a few people who could actually do anything useful on an Altair…