Two 1980s Japanese electronics questions

In the 1980s, here in America at least, Japanese electronics (i.e., computer, TV, and stereo equipment) were all the rage, and they were usually considered superior to American technology.

I heard two things about Japanese technology back then, and now, 25 or so years later, I’m wondering if they were true.

  1. Part of the reason for Japanese success was that after WWII we helped them rebuild or replace damaged factories, so they had newer facilities while ours were decades older.

  2. For the most part the Japanese didn’t invent very much. Instead, what they did was take American products, improve upon them, and then sell them back to the US.

Do these sound about right?

The Japanese take the long view in business, always, and are great imitators. Instead of rushing products to market, they eased in gradually with cheap early versions, improved them as sales caught on, and eventually took the lion’s share of the market in many product categories. An example: when Subaru first came out, it was an unmitigated piece of shit. Now it’s the vehicle du jour for moms everywhere. Nivico (Nippon Victor Company) was of course the RCA offshoot based in Japan. The Japanese used what they saw there and began producing their own comparable products for less money. Eventually, they became known for their quality and reliability.

Japanese product reputation used to be what China’s is today. Over decades, their sales put many US companies out of business.

While US manufacturers gave W. Edwards Deming’s Statistical Process Control a lukewarm reception at best, the Japanese embraced it wholeheartedly. This gave them a quality edge that the US is still trying to match.
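For anyone who hasn’t run into SPC: the basic idea is to set control limits from a process’s own normal variation and fix the process when it drifts, instead of inspecting bad parts out at the end. Here’s a grossly simplified sketch of a control chart; the numbers are made up, and real SPC uses subgroup averages/ranges and table constants rather than a raw three-sigma estimate:

```python
# Simplified illustration of an SPC control chart (not Deming's full method).
# The "measurements" are hypothetical resistor values in ohms.

baseline = [99.8, 100.2, 100.1, 99.9, 100.0, 100.3, 99.7, 100.1]   # in-control history
new_samples = [100.0, 103.0, 99.9]                                  # later production

mean = sum(baseline) / len(baseline)
sigma = (sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1)) ** 0.5

ucl = mean + 3 * sigma   # upper control limit
lcl = mean - 3 * sigma   # lower control limit

print(f"centerline {mean:.2f}, limits {lcl:.2f} .. {ucl:.2f}")
for x in new_samples:
    status = "out of control -- stop and fix the process" if (x < lcl or x > ucl) else "ok"
    print(f"{x:6.1f}  {status}")
```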

That seems to validate the second item, so thank you.
Now I’m curious to find out if our rebuilding Japanese factories after WWII helped to give them any advantage.

I do remember hearing about that a long time ago, but I completely forgot about it till now.

Well, a factory that had been built from the ground up in 1950 would be 30 years old in 1980. The building might be in better shape than, let’s say, Ford’s River Rouge factories (built 1917-1928), but without up-to-date equipment inside, and especially without the management processes needed to make use of newer technology, a shiny new facility wouldn’t be any better than one 30 years older.

Also, as Chefguy pointed out, many Japanese products started out bad and then improved, even as the factories got older.

It’s not so much that we helped them rebuild; it’s that after the war, Japanese manufacturers didn’t have the cultural baggage about how plants and businesses should be run, or the outdated equipment, that the US did. If they’d had the option of using the same old machinery, management techniques and such from before the war, they wouldn’t have gotten ahead like they did, no matter who paid for rebuilding their industry.

Some folks have theorized that’s why the US seems to have fallen behind the rest of the world: we haven’t had to rebuild our culture, economy or infrastructure in who-knows-how-long (since the Civil War, maybe?).

Excellent point.

Globalization should help fix at least some of that.

No cite here, I’m on the phone, but I remember that as part of the agreement ending the war, they were not allowed to maintain much of a military, so they could focus their tech efforts on consumer electronics.


Japan was also able to dump its money into infrastructure and manufacturing, since its military was not (and is still not) a priority. Superior Japanese cameras and electronics were appearing on the scene in the 60s, much earlier than the OP indicates. I bought a Minolta camera and an Akai tape deck in 1967 that were far superior to what the US was producing at the time. Korea has also followed this model, with brands like Samsung and Hyundai, once the butt of many jokes, gaining market share every year.

Scooped by RememberMe, I see.

Beaten to the punch. I just had this conversation in the car the other day with my son, and attempted to explain to him who Deming was and how he helped Japanese manufacturing.

Another factor in the Japanese manufacturers’ success: despite the fact that the transistor and the IC were both invented in the USA, the TV and radio manufacturers here had a tremendous investment in old technology (i.e., vacuum tubes). RCA, GE, and Westinghouse were still using tubes long after Sony converted to solid-state designs. This was logical on the part of the Americans, since they were able to use their obsolete tube plants to make profits, but transistor designs were better, more reliable, and easier to maintain. Plus, tubes burn out, while a transistor will function for decades. This made US electronics obsolete by the mid-1970s; many makers gave up and just imported Japanese-made sets under their own labels.

The Japanese also made great strides in manufacturing: they adopted surface mount technology (SMT) faster, and used their superior Quality Assurance programs to drive costs down. Case in point: the American firm Ampex invented the first portable VCR; it cost a ton and was the size of a filing cabinet. Sony brought out a better unit, much smaller and at a third of the price. Ampex is long gone, Sony survives. It wasn’t low wages or brutal working conditions; the Japanese simply made better choices.

The integrated circuit was invented in Britain or Germany, depending on how you count such things. It was made commercially viable in the US. Same thing with the transistor (except substitute Canada for Britain).

The transistor was invented by Bardeen, Brattain, and Shockley at Bell Labs. As for claims about IC invention, the actual first one was made by Sprague Electric Corp. (North Adams, MA). The man usually credited (Jack Kilby, at TI) actually made a hybrid IC; his flip-flop circuit had flying wire connections. Zenith Electric (Chicago, USA) made crude ICs back in the early 1950s. So there were a lot of people working on the concept.

All of the above points are valid, but I think there is another bit to it:

In the late 1940s through the 1950s, lots of Japanese and American teenagers were into ham radio and model airplanes. Even the rough kids were into things like hot rods, and every high school and junior high still had shop classes. In the 1960s many American teenagers found other things to occupy them, while more of the Japanese kids stuck with nerdier pursuits. Outside of IC design, all the best EEs I have known have been ham operators. There is no replacing teenage passion with book learning.

AND we were using our top nerds to build ten moon rockets or a hundred missiles for big money, not a million transistor radios for pennies apiece. Yes, we developed a lot of technology, but few engineers were learning how to design for cheap mass production. Yes, a few were, but you need perhaps 30 trying it for each successful one.

I would just like to point out that, even in the 1980s, Japanese computers, no matter how well-built, were by and large running operating systems and software from American companies.

Japan had home-grown OSes and software companies, but they had generally limited success, and now every Japanese PC is running an operating system from either Redmond, WA or Cupertino, CA. (Virtually every one… Linux is international.)

The X68000 used Human68k, developed by Sharp; the PC-98, which owned 60% of the PC market in Japan in the 80s, didn’t use DOS until 1990. So I am not sure it’s true that, as a general rule, Japanese PCs used American OSes in the 80s. The FM Towns, which used a DOS derivative, never dominated Japan.

I started in the electronics business in 1980, and I saw this first hand. By that time the Japanese were far more interested in quality than we were. American board factories had chip testers to test a sample of incoming parts, since you couldn’t depend on their quality. Our factory in Little Rock tracked lot numbers for Intel memories, since they discovered that if you rejected a lot and sent it back, it might wind up on your loading dock again.

Sometime around then the Japanese told us (specifically HP, I think, though we were all guilty) that our stuff was crap. They cared about even seemingly trivial things. Back then, components had leads which you stuck through a circuit board. They wanted these cut down to all the same length on the reverse side. It made one pay attention to detail.
It took a while, but we got the message. No one does incoming inspection any more. PC makers expect failure rates for microprocessors at board and system test to be 50 parts per million or below. It is a very different world.

I started working in electronics full time in 1985. We had a board tester, that is, a machine that used a bed of nails to 100% electrically test (or as close to it as you could get) every single bare board we received, because we couldn’t rely on the board makers to do that for us. Bizarre to think like that now.

Good thing we were mostly talking about double-sided boards, and definitely still through-hole.

After the boards were stuffed, wave-soldered, lead-trimmed, etc., they were in-circuit tested to make sure the right components of the right values were in the right locations.

Then they went to a functional board test. Then they went to a burn-in oven for early life screening. Then they went to a series of high speed/traffic tests. Then they went to a manual type of functional test and finally, during configuration for the customer, they were given a systems test. Wow.

If they failed at any of these test gates, they went to a technologist like me, who debugged the board and had it repaired by one of the many repair ladies on the floor, then returned it to the test process.

We had to test the crap out of everything since we had no other way of determining whether or not we were shipping reliable product.
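If you wanted the whole gauntlet in one picture, it was basically a pipeline of pass/fail gates with a repair loop. A rough sketch of that flow is below; the gate names come from my description above, but the 97% first-pass yield and the simulation itself are invented purely for illustration:

```python
# Rough sketch of the test-gate gauntlet described above, with a repair loop.
import random

GATES = ["bare board", "in-circuit", "functional", "burn-in",
         "traffic", "manual functional", "system"]
PASS_RATE = 0.97  # assumed probability of clearing any single gate first time

def run_gauntlet(rng):
    """Push one board through every gate; a failure means debug, repair, retry."""
    repairs = 0
    for gate in GATES:
        while rng.random() >= PASS_RATE:   # failed this gate
            repairs += 1                   # off to the technologist, then repair
        # passed this gate, move on to the next one
    return repairs

rng = random.Random(42)
boards = [run_gauntlet(rng) for _ in range(10_000)]
print("boards needing at least one repair:",
      sum(1 for r in boards if r), "of", len(boards))
```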

There is probably some overlap in philosophy that extends to the automotive field too. I’ve heard that the same car is better when built in Japan (and not just in cases where the JDM version uses different and better parts) because all parts at the Japanese factory have to EXCEED specifications, while at the American factories the parts just have to MEET specifications.