Is US Military tech 10-15 years ahead of private industry?

I was. But to be fair, there is a lot at play here, and there will be many layers of meaning and messaging in these sorts of statements. It isn’t in Taiwan’s interest for the US fabs to undermine reliance on them. There is a whole raft of expectation setting and politics. So I would hesitate to take this only at face value. OTOH, new workers at a new US TSMC plant are not experienced workers at Intel.

I hadn’t seen that. Reminds me a lot of a British sahib in the 19th century complaining about the lazy natives.
People at Intel don’t work 12-hour days either, as a rule. Unless you’re on a project that is behind, like I was. It didn’t seem to work out very well.

What you call a node is more marketing than reality. Like I said, we were bleeding edge and often the first chip through a new process. Yields were pretty low, but good enough for us, so you might not call it mass production yet. And I dealt with very early production parts, used to build and prove in the products that used them.

Over 25 years ago, the joke was that the point of the SIA Roadmap was to tell Intel what they should be doing five years earlier.

I like that. A major goal of it was to make Moore’s Law hold true. I was at a meeting - not part of roadmap discussions - where this happened.

Thanks! Based on zero personal experience, I would have gotten this wrong. I am glad you posted; there is no substitute for actual experience. I associate with people who are far smarter and more experienced than me. This will keep me from displaying my ignorance (I do that often enough as is :slight_smile: ).

This video recently popped up in my YouTube feed.
It details the MP944 microprocessor chipset, which was a classified system designed as the air data computer for the F-14. It was significantly in advance of the 4004 - which is generally regarded as the first commercial microprocessor. The video goes into some arguments for and against the question of what is a real microprocessor. But no matter what, the MP944 wiped the floor with contemporary microprocessor technology. But it wasn’t 10 years ahead.

I worked with a guy who was testing GPS in the late 70s, about 15 years before most heard of it in the early 90s.

I think the real lead the military had was in the predecessor system to GPS, Transit, aka NAVSAT. That had its genesis in the late 50s and was flying in the early 60s. The downside being that you needed a refrigerator-sized computer to navigate with. Things got better of course, and eventually GPS replaced NAVSAT.

GPS became available for civilian aircraft use in 1983, albeit at eye-watering prices. From becoming operational to civilian use was perhaps only 5 years. I remember talking with someone from a local semiconductor design startup in about 1985 and suggesting that they seriously look at what I saw as the coming mass market. I still think it a great pity that they weren’t in a position to do so and went under not long after.
I talked my father into buying a Sony Pyxis receiver in 1991. It is still in a drawer somewhere. And of course the first Gulf War showed how consumer tech had outstripped military supply in some respects, as families of soldiers in the conflict scrambled to buy civilian GPS receivers and send them out to their boys. That was about a 15-year lag.

But, no doubt, GPS was way ahead of anything civilian tech by itself could have conceived of. Not so much because of the receiver technology, but because of the cost of deploying the satellites. That is only really shifting now. Even though Galileo is notionally largely civilian, in reality the only way to get the EU partners on board came down to national security and the desire for a sovereign capability. The speed at which semiconductor tech moved to close the gap from high-priced bespoke receivers to cheap commodity availability was remarkable.

That you can buy an IC the size of a grain of rice to do GPS/Galileo/GLONASS navigation inside your phone is testament to how far we have come.

I have heard that in the 1960s military C3 technology was a couple of generations ahead of anything available to civilians. Which kind of makes sense: there were obvious military necessities for such things, most people had not yet even realized they all definitely needed to be on the Internet, plus obviously the budget thing.

Many of these examples, like GPS and the MP944, are 40-50 years old. Is the military still ahead of civilian stuff today? For example, do they have a neural net processor better than anything Nvidia can come up with?

Some technologies and systems have only military applications, so it doesn’t answer the question to compare a pickup truck with a propane powered potato gun to an M1299 (or whatever, that is just an example).

Drones are something that have civilian and military uses. I’m sure military drones are better at carrying Hellfires, but what about something like reconnaissance/surveillance, where there are both military and civilian applications?

The internet was of course birthed from 1960s military C3 technology, the DARPA-funded ARPANET.

Well yes, the point being that ARPANET was obsolete enough that information about its technologies would not be directly useful for getting a handle on compromising what the military was actually using in critical installations by then.

ARPANET became the Internet; it wasn’t an obsolete technology.

How did ARPANET evolve into the Internet?

Over time, ARPANET expanded, connecting more universities, research facilities, and government institutions. In the 1980s, the National Science Foundation (NSF) developed a network called NSFNET, which interconnected various academic and research networks. As these networks adopted the TCP/IP protocol suite, which ARPANET itself had switched to in 1983, they became collectively known as the Internet. By the late 1980s and early 1990s, the commercialization of the Internet began, paving the way for the World Wide Web and mainstream adoption of the Internet we use today.

Manned can’t persist as well as unmanned:

As far as ahead of commercial, they’ve had “full self driving” for a few years.

The military was not using ARPANET (or a very close equivalent) for, say, nuclear command and control [were they??]. What were they using? [Projects like SAGE air defense were already operational in the 1950s, and packet-switching networks were introduced shortly afterwards.] No idea; you would need to ask a military historian. In fact, the dude who told me this story in the first place struck me more as a hippie than as someone who would have been happily doing top secret work for the United States Air Force in the 1960s–70s, and I would not necessarily take his word about any of this without further research and confirmation, appearances aside.

It was well understood at the time (I used the ARPANET in grad school and even took a seminar from Prof. Licklider at MIT) that the military justification for the ARPANET was largely there to secure funding. Sure, it did have some use for the military, but nothing like GPS, which was primarily a military project. The foundations for modern GPS were laid at the Pentagon at a meeting over a holiday weekend, when the air conditioning was off and they had to get their food from vending machines.

Current military tech: https://twitter.com/peo_stri/status/1753114078900298130/photo/1

Current civilian tech: Apple’s Vision Pro Headset Shows the Future of Computing Is Bulky and Weird | WIRED

Not even close to equal. Certainly not 10-15 years ahead.

Extensive new article on the US-Taiwan culture clash at TSMC.