Where would computer technology be today if we had never had the Cold War?

The internet was born out of ARPAnet, which was a Cold War defense project. Up until the fall of the Soviet Union, one of the biggest demands for building more powerful computers was missile defense.
What if the Cold War never happened? Where would these technologies be today in that scenario?
IMO, if computers were able to get to the stage of playing games, there would still be rapid development. Gaming capabilities seem to be the computer industry’s version of an arms race.

I would guess 10 or more years behind where we are today because high density integrated circuits would not have gotten cheap enough fast enough to justify so much early development.


As for the debate, I’d say that there would be very little difference.
Sure, some development was driven by the military, but most of the semiconductor industry’s progress was driven by consumer and other industrial demand (telephony, for example).

I don’t have a cite, but I remember learning about this in my computer science classes. The military and DOE had many supercomputers built for nuclear simulations. IIRC, this was where Seymour Cray got his start: designing computers for the lab at Los Alamos.

I don’t think so.
Cray was an ex-CDC guy who went on to found his own company. The fact that his computer was bought by Los Alamos doesn’t mean that the defense industry funded the research.

Human memory is notoriously poor. Your “memory” is not a cite.

Military supercomputers did not lead the development of the industry. Instead, they were simply special applications of commercially available components. Their quantities were far too small to draw the interest of manufacturers.

Modern computer technology was driven by the development of mundane, high-volume products. In 1978 the Fairchild F8 emerged as a viable controller for commercial products. It was soon joined by National Semiconductor’s COPS line and Motorola’s 68xx devices. These were used in microwave ovens, car radios, handheld toys, appliances, bomb fuzes, etc. They were called microcomputers at the time, along with microcontrollers and microprocessors. The minimum purchase quantity for microcomputers was 50,000, and they were sold by the millions per year.

Moore’s law, operating under commercial pressures, allowed ever greater levels of integration until by 1998 the controller had absorbed much of the system and operated at speeds well beyond prior ‘super computers’.
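Just to put rough numbers on that claim (a back-of-the-envelope sketch; the starting transistor count and the exact doubling period are illustrative assumptions, not figures from the thread):

```python
# Back-of-the-envelope Moore's-law scaling: transistor counts doubling
# roughly every two years. The 1978 starting figure (~10,000 transistors
# for a single-chip controller) is an illustrative assumption.
def moores_law(start_count, start_year, end_year, doubling_period=2.0):
    """Project a transistor count forward assuming periodic doubling."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2 ** doublings

projected_1998 = moores_law(10_000, 1978, 1998)
print(f"Projected 1998 count: {projected_1998:,.0f}")
# 1978 -> 1998 is ten doublings, i.e. roughly a 1,000x jump in integration
```

Ten doublings over twenty years is a factor of 1,024, which is the scale of change that let a “controller” absorb the rest of the system.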


Which was handled by Berkeley wonks. Yes, *those* Berkeley wonks, with the hair and the cardigans and the peace symbols. In turn, the structure of ARPAnet strictly followed (and, really, built upon) the primitive inter-university networks that already existed.

The key difference with ARPAnet/internet is its dynamic aspect, the fact that it doesn’t matter if a server, or indeed an entire country, suddenly vanishes from the web - your requests will automagically reroute themselves using available pipes. This is the only aspect that owes directly to the Cold War and military involvement in the project - they needed that in case The Reds nuked this or that C&C center. But it’s not strictly speaking necessary for a global network to function, nor is it a specifically military criterion - power stations blow up on their own, cables get cut, university servers get violently disconnected to plug in the vacuum cleaner (true story). Redundancy/dynamic rerouting makes sense on its own merit.
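The rerouting idea above can be sketched in miniature (a toy graph search with made-up node names, not real internet routing, which uses protocols like BGP):

```python
from collections import deque

def shortest_route(links, src, dst):
    """Breadth-first search for a path from src to dst over working links."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no surviving route

# Toy topology: two independent routes between A and D.
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}
print(shortest_route(links, "A", "D"))  # ['A', 'B', 'D']

# Node B vanishes (nuked, blown up, or unplugged for a vacuum cleaner):
dead = "B"
survivors = {n: [m for m in nbrs if m != dead]
             for n, nbrs in links.items() if n != dead}
print(shortest_route(survivors, "A", "D"))  # reroutes via C: ['A', 'C', 'D']
```

The point is the same one made above: as long as any chain of working links survives, traffic finds it, and nothing about that is specifically military.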

The internet has little to do with supercomputers, nor does it require any.

The military requirement to second-source electronic components was a major factor in the expansion of modern electronics. It promoted competition and kept the technology from being locked up behind patent and copyright protection. The military buildup in the Cold War certainly affected that decision. OTOH, without the massive military spending, the private or public sector may have spent that same money in other ways, prompting the same kind of rapid development. The timetable would have changed somewhat, but there’s no telling whether consumer electronics demand, in a better economy that wasn’t dragged down paying for the Cold War, might have spurred even more rapid development. It was technology and the opportunity to exploit it that drove this process; no particular motivation can clearly be seen as superior.


To the contrary, the military/aerospace industry used commercial products that were tested to their specs.

The LEM computer that went to the moon was made with standard IBM commercial components. The FU-139 fuze had the same controller that Mattel used in toys.

Mil/Aero was a minor market that the industry served out of respect for the National need and the premium dollars it provided.


Don’t know if this link will come up on the same page in the book.

The second source requirements existed and certainly spurred competition and growth of semiconductor suppliers. Private industry picked up on this and became a stronger adherent than the military.

I’m old enough to have been around when computers weren’t a thing people owned. I don’t see any reason why their development was inevitable. Investors weren’t going to start up companies to build thousands of computers (let alone millions) in the hopes that when they did so, customers would appear.

So I feel the military did have a huge effect on the computer business. It was a single customer that bought enough products to justify creating an industry to supply them. And then once that industry existed, it sought out other customers and developed computers for individuals.


Thanks for the link, but it is counter to your argument. The VHSIC program was costly and yielded nothing. Commercial pressures caused the industry to race ahead of it. Second sourcing of mil/aero product was never a factor. The opposite is true. The industry developed reliable, inexpensive encapsulation techniques in order to lower costs. Plastic encapsulation survived 50,000 gs much better than the military cavity package.

The military had a standard computer definition for a multi-sourced single-chip computer. Some companies built it as a custom part, but it was commercially obsolete before the spec was released.

Can you name a commercial semiconductor device that became popular as the result of mil second sourcing?


Little Nemo,

IBM built the first commercial computers (the IBM 702, 704, and 705) for a market forecast of 14 units. The demand for the 704 resulted in something like 175 units being shipped. In response to that market demand, IBM developed the 709 and 7090, then the RAMAC disk and the 360 series. With that, the industry was born.


Semiconductor manufacturers didn’t license competitors to produce their products out of the goodness of their hearts. It was done on demand from the government and industry. Competition lowered costs and aided the development of consumer technology. Whether it led to the success of any particular device doesn’t matter; it created demand for standardized parts like 7400-series ICs, spread technology, and encouraged its growth through competition. We would not have had as many companies developing semiconductors as rapidly without the second-sourcing effort.

Second sourcing was seldom an issue. If the military wanted a proprietary product they could take it or leave it. They usually gave us a waiver and took it.

Second sourcing is an outdated concept. If the problem was ensuring supply, we’d agree to run their product on more than one line.

Ok, I’m not arguing with what you say. I am stating that second sourcing did exist, and it resulted in manufacturers licensing their products to other manufacturers, who grew and developed their own technology. For several of these companies, I know that the second-sourced products were the basis of their profitability. I don’t believe that would have happened without the government’s second-sourcing requirements. I’m not saying that is the only way the technology could have developed, as I pointed out initially, but that is what did happen.

Also, at least in private industry, second sourcing definitely led to competitive pricing in an open market. If you were price-sensitive, and industry rarely isn’t, you would purchase at the lowest possible price; without competition from multiple suppliers, the prices of all products would have stayed higher. That definitely would have slowed the development of consumer devices. Here in the twenty-somethingth century, major companies can sponsor the development of any component they want, but the history leading up to this was a result of smaller companies breaking ground on lower budgets. Those smaller companies might grow, as Apple did, or might only test the market and be taken over, as happened with the IBM PC. And the growth of PC clones would not have occurred, even if Bill Gates did snooker IBM, if the clone manufacturers couldn’t build their machines from low-cost components.

I don’t know if I could find a cite to prove what I am saying, but I think the logic is clear: second sourcing was a major factor in the competitive market at the component level that was necessary for the era of consumer electronics development being discussed in this thread. If I’m wrong, I’d need a cite showing otherwise to convince me.

Without the Cold War, the impetus for the earliest computers such as ENIAC and Whirlwind I might not have been there, and their applications (atomic weapon calculations, among other things) certainly might not have been so vital.

That’s the point at which the lack of a Cold War might have significantly changed the timeline of computing, IMO.


Those computers were academic novelties. IBM SAGE was the first full sized computing effort for the military. It came complete with the multi-story building that housed it.

There was a continuum of development at IBM that led to full-sized commercial computers: from the mechanical 402, 405, and 407, to early experiments with the CPC (card programmed computer) using back-to-back 407s, to the decimal 604 and bi-quinary 650, to the variable-word-length BCD 705 and the binary 704. None of this had anything to do with the Cold War other than that the military was just another customer. 704s were used at Ft. Huachuca to keep track of supplies.


So the industry wasn’t born with Colossus and ENIAC and BINAC and Whirlwind?