Computers - my how they've grown

How did you enjoy Pascal? That was the first programming language I learned and I never much liked BASIC when I tried it later, but I also knew people who started with BASIC and couldn’t seem to get on with more structured languages. When I later got the computer mentioned in the OP one of the first things I bought was Turbo Pascal.

Pascal was a kind of revelation for me as the first real procedural language I learned. As you might know, C-64 BASIC was really restricted: the only major control structures were IF…THEN, GOTO, FOR…NEXT, and GOSUB, a sad excuse for procedural programming. Pascal had real subroutines and functions, plus more sophisticated features like proper variable declarations, which were also a mess in C-64 BASIC. Oh, and the differentiation between global and local variables.
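To make the local-vs-global point concrete, here's the contrast sketched in Python (since C-64 BASIC can't conveniently be run here): every C-64 BASIC variable lived in one shared global namespace, so a GOSUB could silently clobber the caller's state, while a Pascal-style procedure's locals are isolated.

```python
x = 10  # in C-64 BASIC, every variable lives in one shared global namespace

def double(n):
    # Pascal-style procedure: n and result are local to this call,
    # so they cannot collide with the caller's variables.
    result = n * 2
    return result

print(double(x))  # 20
print(x)          # still 10; a GOSUB sharing one namespace gives no such guarantee
```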

I enjoyed it quite a lot. My group in grad school developed a Pascal compiler for the PDP-11 in 1974, and I was the designated Pascal teacher for my adviser’s big classes that used it. I could teach the entire language in two lectures.
Plus, the language I developed for my dissertation was an object-oriented version of Pascal (long before C++), which I implemented by hacking Jensen and Wirth's Pascal compiler.
I recognize its shortcomings, but Pascal is a great teaching language.

My first logic lab assignment was to implement one of these. Fun!

I forgot to mention the best computer system of all - PLATO. In 1975 I got to use:

email
Instant messaging
Chat rooms
Message Boards
An online newspaper which I wrote for
Multi-user dungeons
Multiplayer real time games
Touch screens
Very decent computer graphics

All decades before the rest of the world got this stuff.

ZX Spectrum for me, 48K, sweet. £129 in 1983 and it was a thing of beauty and wonder. So many games, so much squeezed from so little. I also bought the “microdrive” soon after - a looped-tape storage device that meant I could load up “The Hobbit” in mere seconds.

mind

blown

My first computer was a Franklin ACE-1000, which was an Apple ][+ clone. We bought it with an 80-column card. Eventually we had two floppy drives, a 2400 baud modem, an 80 column green display, a 40 column color display, and a dot matrix printer.

My parents have said the $2000 or so they spent on that computer was by far the best investment they ever made in my and my brother’s education, as we both now have jobs managing computers.

My first computer experience was a USAF analogue vacuum tube/servo system that roughly simulated the B-25 aircraft. The instructor sat in the chair and monitored the student in the cockpit:

Just as I left the USAF, IBM was looking for technicians to service their new 704 and 705 digital computers. They sent me to their computer school, and a year later to another school on the 709. Their curriculum was to teach every machine IBM had ever made. For the last four weeks of school we worked on the assembly line, debugging initial power-up on mainframes. With ±250-volt power supplies, that was exciting. “Tune for maximum smoke” was a common tactic.

At the time the computers were leased rather than purchased. The customer provided the facility and an operator; IBM provided all equipment, spares, and a resident maintenance crew. There were six of us for the two machines at Lockheed Burbank, a 704 and a 705, and later three of us moved to JPL Pasadena to support their double-precision (64-bit) 704. After three years in the field I was transferred to the Advanced System Development Division design group in San Jose, CA.

By dumb luck I’d got drafted and ended up with a million dollar education and a dream job. It was a different world.

Is this basically about Moore’s Law (and what is its future)? What about fundamental paradigm shifts or improvements in the architecture of computers themselves, going forward? (Newer computers are faster than older ones because of a whole lot of improvements that can hardly be glossed over, obviously, while at the same time things like binary arithmetic and the von Neumann architecture remain recognizable.) The same can be asked about the future of software.

Computers that take up an entire room (and the resultant need for power and cooling systems) still seem to be a thing, at least if you want a fast enough supercomputer.

The talk of modems got me to thinking. I would never go back to that… BUT, there was something about waiting until the long distance dropped to night/weekend rate and then you heard the modems negotiating. Being always connected just doesn’t give the same endorphin kick.

I forgot to mention that the Univac I was all vacuum tubes. They had someone patrolling the room with a box of tubes to replace the burned-out ones. The arithmetic registers were triplicated, and if at least two of the three agreed on a result, it was accepted; if all three differed, the operation was repeated. Any time an arithmetic operation overflowed, the machine took its next instruction from memory location 0, and it was up to the program to have put there a pointer to code that would handle the overflow.
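That 2-of-3 voting scheme is easy to sketch. Here's a minimal Python illustration (the function name and the retry convention for total disagreement are mine, not Univac's):

```python
def vote(a, b, c):
    """Return the 2-of-3 majority result, or None if all three registers
    disagree (in which case the Univac I simply repeated the operation)."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    return None  # no majority: repeat the operation

print(vote(42, 42, 42))  # 42 - all three registers agree
print(vote(42, 42, 99))  # 42 - one faulty register is outvoted
print(vote(1, 2, 3))     # None - total disagreement, so retry
```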

Moore’s Law is pretty much dead, and I saw lots of charts at the conference I went to last week testifying to that. Clock speeds have not increased much. The reasons are both technical and economic: with fewer foundries that can build cutting-edge chips, and the gigantic cost of building a new fab that can handle a new process, it no longer makes sense to follow the Moore’s Law path.
What is being done instead is using chiplets - relatively small pieces of silicon which are not individually packaged but mounted together on a substrate. For instance, instead of making one chip with 16 processor cores, you can make 8 chiplets of 2 cores each. Yields go up, costs go down, and you can increase speed with more cores rather than with expensive moves to smaller geometries.
Systems will get bigger and have more capacity, but chips won’t.
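The yield argument can be made concrete with the standard Poisson defect model (the probability that a die has zero defects is e^(-defect density × area)). The die areas and defect density below are hypothetical round numbers for illustration, not figures from any real process:

```python
import math

def die_yield(area_mm2, defects_per_mm2=0.001):
    # Poisson zero-defect model: probability a die of this area is good
    return math.exp(-defects_per_mm2 * area_mm2)

# Hypothetical: one 16-core monolithic die of 600 mm^2
# versus eight 2-core chiplets of 75 mm^2 each.
mono = die_yield(600)
chiplet = die_yield(75)

print(f"monolithic die yield: {mono:.1%}")   # about 55%
print(f"single chiplet yield: {chiplet:.1%}")  # about 93%
```

Even though the eight chiplets use more total silicon (plus a substrate to tie them together), each defect now scraps one small chiplet instead of an entire 16-core die.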

There are coffee machines today which have significantly more computing power than the Apollo guidance computers.

Your “pocket computer” is more powerful than a 1970s era supercomputer, and makes the 1960s era supercomputers look absolutely silly.

I posted this in another thread a few months ago:

The 1975 Cray 1 supercomputer - it really was “super” for its time, and widely recognized as such - was larger than a phone booth and did 78 MFlops. An iPhone 5s did/does 80 GFlops - about 1,000 times faster than the Cray 1. And we’re several iPhone generations past that.
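For what it's worth, the arithmetic checks out using the post's own figures:

```python
cray_1 = 78e6      # 78 MFLOPS, the Cray 1 figure quoted above
iphone_5s = 80e9   # 80 GFLOPS, the iPhone 5s figure quoted above

ratio = iphone_5s / cray_1
print(f"speedup: {ratio:.0f}x")  # roughly a factor of 1,000
```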

My first machine was a 1977 Apple ][+ that my father bought when I was 15. I learned Basic and assembly language on it, cracked copy protection on games for fun. Got an Apple IIe when it came out and wrote a MIDI sequencer for myself since I couldn’t afford to buy a MOTU one for $150.

When I was in high school I’d programmed an HP 2000 mini via 300-baud modems on butcher-paper Teletypes. That was tough; I preferred the Apple II.

My current personal machine is an AMD 5800X with 7TB storage, 64GB RAM. GPU is last gen NVidia 1080, I don’t do that much gaming. I built this box at the beginning of the year and with NVMe storage it is stoopid fast.

The iPhone 14 is about to (or has just) come out. :slightly_smiling_face:
I am reading this on an iPhone 6s. And that was a hand-me-down from a friend who upgraded.

No matter. I am planning to get a middle-of-the-road iPad Pro: 1 TB of storage, 16 GiB of RAM, 2.5 TFLOPS.

One of the odd things is how the TI-99/4A expanded via a port to the right. Here’s what a fully expanded one would look like:

My first computer was the TI-99/4A. And I was able to hook up a cassette player to it. The only thing I remember was my BASIC program that did power rankings for NFL teams.

And I had a job in the 80s (and some of the 90s) as a computer operator on a mainframe computer. I’ve wondered how the existing laptops compare to the mainframe computers back then.

Relevant XKCD “What If?”

https://what-if.xkcd.com/31/

Someone correct me if I’m wrong: the things you’re saying weren’t specific to Commodore-64 BASIC. They were the way the BASIC language was back in the day (i.e. the C-64’s heyday), though later versions of BASIC came along and added more features (e.g. more support for procedural programming).

Some of those optimizations were new for the C-64.

The C-64 used a version of Microsoft BASIC, which was in continuous evolution from its humble beginnings (4K Altair 8080 BASIC on paper tape) to Visual Basic .NET. I estimate (from personal exposure) that there was about 80% commonality across all platforms, but each specific target often had some optimizations or adaptations. The major feature jump was, AFAIR, QBasic, which came with later versions of MS-DOS, since that version changed things like no longer requiring line numbers and providing more modern program structures.