Is this kinda sorta how computers work?

God help me, at first I thought you were joking about flying spaghetti monsters…

Well, like I said, computers don’t actually function that way anymore. But if you look at how a simple CISC-style machine worked way back when, it’s just a finite state machine that cycles through a loop of states. Then you realize that a modern RISC architecture, while looking quite different, is just an unrolling of that loop, so that each state in the CISC version gets its own dedicated hardware in the RISC version. I think that makes it a bit more obvious where the various pipeline stages of a RISC machine come from, but that’s just my opinion.
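
Concretely, here’s a toy version of the loop I mean. This is just a sketch; the four states, the opcodes, and the memory layout are all invented for illustration:

```c
/* Toy CISC-style control FSM: one shared datapath steps through
 * fetch/decode/execute/writeback states in a loop. Opcodes and
 * memory layout are made up for illustration. */
#include <stdint.h>
#include <stdio.h>

enum state  { FETCH, DECODE, EXECUTE, WRITEBACK };
enum opcode { LOAD = 0, ADD = 1, STORE = 2, HALT = 3 };

int main(void) {
    uint8_t mem[256] = { LOAD, 10, ADD, 11, STORE, 12, HALT };
    mem[10] = 2; mem[11] = 3;              /* operands */

    uint8_t pc = 0, acc = 0, ir = 0, addr = 0;
    enum state s = FETCH;

    for (;;) {
        switch (s) {                        /* one state per clock "tick" */
        case FETCH:
            ir = mem[pc++];                 s = DECODE;    break;
        case DECODE:
            if (ir == HALT) goto done;
            addr = mem[pc++];               s = EXECUTE;   break;
        case EXECUTE:
            if (ir == LOAD) acc = mem[addr];
            if (ir == ADD)  acc += mem[addr];
            s = WRITEBACK;                                 break;
        case WRITEBACK:
            if (ir == STORE) mem[addr] = acc;
            s = FETCH;                                     break;
        }
    }
done:
    printf("mem[12] = %d\n", mem[12]);      /* prints 5 */
    return 0;
}
```

The RISC “unrolling” is then just giving each case of that switch its own dedicated hardware and letting all the stages work on different instructions at once; that’s where the fetch/decode/execute/writeback pipeline stages come from.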

How far back do you want to go? I wrote a simulator for an 8086 when it came out, and I don’t think an FSM version would have been too useful.
Perhaps you are confusing architectural states, such as instruction fetch, instruction decode, and instruction execution, with states in the FSM sense, which are the possible combinations of storage element values. In the latter sense, each combination of register and program-counter values represents a distinct state, so you get an explosion even for very simple architectures.
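
To put numbers on it: even a stripped-down machine with eight 16-bit registers and a 16-bit program counter has 2^(9×16) = 2^144 ≈ 2×10^43 possible storage configurations. An FSM drawn at that level is hopeless; the fetch/decode/execute “states” are really states of the much smaller control unit.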

Never thought about the overloading of the acronym. Though I have seen some spaghetti code in RTL.

To add another turtle, there was a company called Nanodata that sold a machine allowing nanoprogramming: you constructed your microinstructions out of nano-operations, and your instructions out of micro-operations.
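
In rough shape it was an extra level of control-store indirection, something like this sketch. Every field width, encoding, and “control bit” here is invented for illustration and bears no resemblance to the real machine’s format:

```c
/* Two-level control store, loosely after the nanoprogramming idea:
 * an opcode selects a sequence of microinstructions, and each
 * microinstruction selects a nanoinstruction whose bits drive the
 * datapath directly. All widths and encodings are invented. */
#include <stdint.h>
#include <stdio.h>

#define NANO_REG_READ  (1u << 0)        /* hypothetical control bits */
#define NANO_ALU_ADD   (1u << 1)
#define NANO_REG_WRITE (1u << 2)

static const uint32_t nanostore[] = {   /* nano-ops: raw control words */
    NANO_REG_READ,                      /* 0: gate a register onto the bus */
    NANO_REG_READ | NANO_ALU_ADD,       /* 1: gate second operand, add */
    NANO_REG_WRITE,                     /* 2: latch the ALU result back */
};

static const uint8_t microstore[] = { 0, 1, 2 };  /* micro-routine = nanostore indices */

static const struct { uint8_t start, len; } microtable[] = {
    { 0, 3 },                           /* opcode 0: ADD built from three micro-steps */
};

int main(void) {
    uint8_t opcode = 0;                 /* pretend we just fetched an ADD */
    for (int i = 0; i < microtable[opcode].len; i++) {
        uint8_t mi = microstore[microtable[opcode].start + i];
        printf("step %d: control word 0x%x\n", i, (unsigned)nanostore[mi]);
    }
    return 0;
}
```

The point is just the double indirection: both the instruction set and the microinstruction set are defined by tables rather than fixed hardware.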

IIRC, one of my qual questions was about the utility of picoprogramming. The correct answer was that this was a bad idea. The market said that nanoprogramming was a bad idea also.

I remember that. It was about the time I decided to get out of hardware. Programmable logic sounded great, but the speed and density of Intel and Motorola processors were making it pointless. You could prototype something that wouldn’t have been economical to produce, and a stock product could do it faster from the outset.

What was the distinction between nano and pico programming?

What really did it was the death of the minicomputer in favor of computers based around microprocessors. Microprogrammed machines were good in that the hardware implementation was easier at a time when things didn’t scale very well. In those days all CPUs were custom. When you could pop in a CPU bought as a commodity, everything changed. Computers today are a lot less diverse than they were back then.

The effect you cite killed the special-purpose hardware engines, like the simulation accelerators that were popular in the early 1980s. You started with a speed advantage (at a high price), but if you picked the standard-computer route you got speedups for less money, most of the time without even having to recompile.

The handwriting was on the wall in 1980, and when I got my PhD I went into another area, though I continued to write some papers and be involved in TC-Micro for another five years.

I don’t remember, since it was not a serious proposal. It was basically exposing control signals for much finer-grained logic operations. It was expected that an architect could figure out what was meant without a lot of background.