I am an electronics hobbyist, and the bolded sentence does not make any sense to me. The original Breakout arcade board had some chip count <50, got it. By eliminating redundancies in the hardware, etc., Woz reduced the number of chips by 50, got it.
“Design so tight that it was impossible to reproduce on an assembly line” does not make any sense to me on its own. My best guess is that someone non-technical heard some factoid about how impressive the chip-count reduction was, got confused, and posted it on Wikipedia. What could they be talking about?
That whole paragraph apparently has six citations (those numbers at the end). If anyone got confused, I’m pretty sure the confusion isn’t limited to Wikipedia, and it isn’t an invention of the editors.
If the story is accurate at all, it was the chip count that mattered, not their density on a circuit board. There wasn’t that much automated PCB construction at the time either, though I don’t know what Atari was using. The sentence doesn’t really make sense as written.
He might have just included a lot of discrete components, reducing the chip count but not the component count. The component cost may have been less, and the construction cost more as a result. Or he was getting ripped off. It’s similar in nature to Tesla’s conflict with Edison.
Density on circuit boards does not matter in an arcade environment; space is at NO premium inside the cabinet. My Mortal Kombat arcade board is two main boards sandwiched together and then a third small sound board on top, and the entire thing is bigger than many desktop computers. There is no way that if the board was too “dense” in the '70s, they couldn’t just scale up the PCB in total size so that their manufacturing process could handle it.
It makes no sense to me how their assembly line could somehow handle a 100+ chip PCB but not be able to manufacture Woz’s 40-something chip PCB, and the quote as written is gibberish that sounds like a non-techie restating something they heard but didn’t understand.
Tight may not mean space. It may have been very tight timing. It may have worked fine when built with chips from the same batch, but in production, tolerances on the chips may have meant that many complete systems simply didn’t work, because the variation in gate delays and other related issues is difficult or impossible to manage in a production environment. Some of the old arcade games were not really pure digital - you can use the fact that the input of a gate is a Schmitt trigger or just a comparator to good effect, and you get some very tight and evil circuits. It would not surprise me if the design made use of these ideas too.
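Just to make that concrete, here's a rough sketch (Python, with made-up delay numbers, not anything from the actual Breakout board) of how a timing-marginal design can work perfectly with one batch of chips and fail on another batch that is still within spec:

```python
import random

# Hypothetical numbers for illustration only: a signal must ripple through
# a chain of gates and settle before a 100 ns deadline.
N_GATES = 12
DEADLINE_NS = 100.0

def path_delay(batch_mean_ns):
    """Total delay through the chain; each gate varies a little around its batch's mean."""
    return sum(random.gauss(batch_mean_ns, 0.5) for _ in range(N_GATES))

def failure_rate(batch_mean_ns, trials=100_000):
    fails = sum(path_delay(batch_mean_ns) > DEADLINE_NS for _ in range(trials))
    return fails / trials

# A fast batch (the chips the prototype happened to be built with) never misses...
print("fast batch:", failure_rate(batch_mean_ns=7.0))
# ...while a slower batch, still inside the datasheet limits, misses constantly.
# That's the kind of failure that only shows up once you build boards in volume.
print("slow batch:", failure_rate(batch_mean_ns=8.5))
```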
There are stories that the original CDC 6600 was designed and hand built by Seymour Cray, and that whilst it worked, it took another 18 months to make it work as a production design, as its timing was so tightly tuned. (The 6600 series was filled with colour-coded lengths of twisted pair, all neatly wound into hanks. They were the individual signal delays needed to get the beast to work.)
This story has been told and retold many times in interviews with practically everyone involved. Here is what Al Alcorn, Steve Jobs’ boss at Atari, said in this interview:
So it would seem Steve Wozniak says it was because Atari didn’t understand the design, and Al Alcorn says it was because Atari didn’t understand the design. The missing link in the story is that Steve Jobs was subcontracting the work to Wozniak without Atari’s knowledge. There was no direct communication between Atari engineers and the engineer who built the board.
This was typical of early mainframe construction. The racks for large Burroughs systems were often arranged in a star formation around the CPU so that all cabling between the components was the same length. But a Breakout arcade game was much smaller and operating much slower than those machines.
I think Crazy Horse has a better description of the issue. This was typical of Woz’s early designs, using a few discrete components to replace chips and using far fewer gates to do the same job as a conventional design. When the Apple ][ floppy disk drive arrived after a year of waiting, people receiving it thought they were missing some electronics, because the interface and the disk drive had such simplistic electronics.
ETA: In a sense you are correct, sometimes his designs were based on close proximity of components to get the timing right.
I can perfectly believe that Woz was better than early Atari engineers and that they couldn’t understand his design, and thus it wouldn’t have been cost effective overall to sell his Breakout board even if it was cheaper to manufacture… Arcade operators who have one chip on the board go bad and are told by Atari to buy an entire new board for thousands of dollars would NOT BUY ATARI anymore. Atari and the arcade ops would be better served by a board design serviceable by techs, one they could write repair manuals etc. for; if they don’t understand how the design works (i.e., what chip does what), then when something goes wrong, their only response would be “replace the whole board”.
I’m going to believe that THIS is the reason they did not use Wozniak’s design, since it actually makes sense and is not gibberish, but that is not even in the same SOLAR system as " a design so tight that it was impossible to reproduce on an assembly line." That just does not make any sense.
Yes, one of Woz’s great ideas with the Apple was to realize that with programming, the CPU could do the same job as a huge collection of dedicated disk controller chips. While rivals like TRS-80 and Commodore were selling add-on diskette drives for huge amounts (the Commodore drive unit cost more than the computer, about $1500) the biggest expense for Apple owners was the diskette drive unit itself, about $250.
Reading between the lines, Woz did the hardware equivalent of spaghetti-code programming. It was too complex for the Atari guys to understand, so they ditched the design. (Repair costs vs. board replacement, as mentioned above.) Possibly there was also the problem of writing the diagnostics: “if this happens, it’s this chip”.
If the design is too dense to understand, and one faulty component can impact a lot of diverse functions, how do you write the repair manual?
I am going to go with what Crazyhorse had sourced. Since Jobs had sub-contracted the actual job to Wozniak, he himself had no idea what was going on. Plus, I am sure he did not ask Wozniak for any technical documentation at the time, thus compounding the issue. I can only imagine that meeting where Jobs presents the reduced board but has no idea what modifications have been made and how anything works. I am sure Atari was not pleased.
What would be really cool would be to see what Wozniak did with the original Atari board as far as component reduction was concerned.
It doesn’t even have to be a lack of understanding by the Atari engineers. The original hardware design for an Epson Z80 PC used American “ingenuity” design. The Epson engineers wouldn’t accept it because they calculated some timing failures if every gate had the maximum propagation time. Highly unlikely, but they wouldn’t do it with the possibility of 1 in 10,000 boards failing, and they upped the chip count. They added a lot of logic to maintain wait states, effectively slowing down the processor. But in the end the major failure was using an 8-bit processor in the '80s when 16-bit was taking over.
Tri,
I’ve never dealt with an office equipment/personal computing device (except furniture) that had a failure rate lower than 1 in 10,000.
Is there some reason 1980s printer manufacturers were so concerned about QC?
Ok, first, I don’t recall the actual failure rate that was tolerable for them; 1 in 10,000 is an approximation.
Epson did have high quality standards. They were using their standards for electronic circuits. All logic had to operate in between the extremes of the propagation time based on the stated numbers for the chips. So if a signal could travel through 10 gates, each with a 10 ns max propagation time but 5 ns average propagation, and the signal had to get to the end in 95 ns, that wouldn’t be allowed, even though it would be unlikely for the total propagation time to exceed 50 ns by much. Nobody would allow such a device to be made in volume now, though.

But in the '80s there were still some designs out there that hadn’t been analyzed to that extent. I don’t know how many times I saw a circuit fail because one component was a little slow, but still within tolerance. I had my hands on most of the PC designs brought to market in the '70s and '80s.

It was very common to see a 7400 quad-NAND chip used as a square wave generator for the CPU clock. It usually worked fine if the ramp in voltage was steep enough when it changed states, but every once in a while a chip didn’t quite behave like that, and the clock signal got a little wild. Nowadays nobody would repurpose a chip like that; they would use a device specifically designed to be a clock generator. There were clock generators back then, but they usually cost a lot more than a single 7400 chip.
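A quick back-of-the-envelope version of that example (the assumption that each gate's delay is spread evenly between typical and max is mine, purely for illustration):

```python
import random

# The example above: 10 gates, 10 ns worst case each, 5 ns typical, 95 ns budget.
N_GATES, T_MAX, T_TYP, BUDGET = 10, 10.0, 5.0, 95.0

print("worst case:", N_GATES * T_MAX, "ns")   # 100 ns -> fails a worst-case analysis
print("typical:   ", N_GATES * T_TYP, "ns")   #  50 ns -> looks fine on the bench

# How often might a real board actually blow the budget? Assume (for
# illustration only) each gate's delay lands anywhere between typical
# and max with equal probability.
trials = 1_000_000
fails = sum(
    sum(random.uniform(T_TYP, T_MAX) for _ in range(N_GATES)) > BUDGET
    for _ in range(trials)
)
print(f"simulated failure rate: {fails / trials:.1e}")
```

So a design like that fails the strict worst-case check while only rarely failing in practice, which is exactly the gap between the American designers and the Epson engineers.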
Not sure I can comment on the question in the OP itself, but the anecdote makes Jobs seem like a real dick. Not only did he keep most of the money, but he also neglected to mention to Atari that he sub-contracted out the work.
He was one, in that case. And others. I figure Wozniak forgave him when he bought his first island villa or private plane after co-founding Apple, though.
Were it not for Jobs’ shrewdness, Wozniak’s inventions may have been doomed to the back of hobbyist catalogs forever, and were it not for Woz’s engineering brilliance, Jobs might have found himself without much to sell and no capital to continue with what became his prolific career.