Remembering my school shop classes, when I was lucky if I could cut or shape anything to within an eighth of an inch, I can’t imagine how manufacturers get everything perfectly straight, flat, level, and dimensioned down to the friggin’ micron. Micrometers that are gauged down to **one ten-thousandth** of an inch? How?
I know, of course, that they use precision machine tools; but how the heck did they build the machine tools? With an earlier generation of machine tools? But how do you get the precision to begin with? I mean, originally, at the dawn of the Industrial Revolution, it must all have started somewhere with blacksmiths and carpenters doing the best they could with handmade tools. How did they manage to eliminate cumulative error and get the tolerances to improve with each new generation of tools?
This is an excellent question!
Standards of precision are generally taken from national (e.g., the National Bureau of Standards) or international reference standards. These standards have become more precisely defined over time as the technologies to reproduce and reference them have improved. In this way machining has become more precise as the instrumentation for measuring precision in, for example, the tool-and-die work that creates the tools has improved.
It’s really sort of a forward-moving, circular process, and there is no “ur” machine to reference back to (a Clovis point, perhaps?).
Well, I think you need to make the distinction between precision and repeatability. It’s much easier to make something that does the same thing over and over again (repeatably) than it is to make something which matches something else over and over again (precisely).
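To make that concrete, here’s a toy illustration with completely made-up numbers (not real shop data): one imaginary machine turns out parts that are nearly identical but consistently oversize, while another centers on the nominal dimension but scatters all over the place. The spread is the repeatability; the offset from the reference is the precision against a standard.

```python
# Toy illustration of repeatability vs. precision-against-a-standard.
# All numbers are invented for the example.
import statistics

target = 0.1000  # nominal dimension in inches (hypothetical)

# Machine A: very repeatable, but consistently cuts about 0.105"
machine_a = [0.1051, 0.1049, 0.1050, 0.1052, 0.1050]

# Machine B: centered on the target, but with a lot of scatter
machine_b = [0.0980, 0.1030, 0.0995, 0.1015, 0.0985]

for name, parts in [("A", machine_a), ("B", machine_b)]:
    offset = statistics.mean(parts) - target  # distance from the reference
    spread = statistics.stdev(parts)          # consistency with itself
    print(f"Machine {name}: offset {offset:+.4f} in, spread {spread:.4f} in")
```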
Perhaps the greatest invention of the early industrial revolution was the screw-cutting engine lathe, with a lead screw driving the carriage. This machine could make repeatable screws which could be used in other machines.
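As a rough sketch of why that lead screw matters (the gear counts below are invented, and real lathes use compound change-gear trains): the spindle and the lead screw are tied together through gears, so the pitch of the screw being cut is the lead screw’s pitch scaled by an exact ratio of tooth counts. One good screw can therefore father a whole family of screws with exactly related pitches.

```python
# Simplified sketch of how a screw-cutting lathe propagates pitch.
# The workpiece turns with the spindle; the lead screw drags the tool along.
# If the lead screw makes driver/driven turns per spindle turn, the cut
# thread's pitch is the lead screw pitch times that ratio.
# Gear tooth counts are invented for illustration.
from fractions import Fraction

def cut_pitch(leadscrew_pitch, driver_teeth, driven_teeth):
    """Pitch of the thread being cut, for a simple two-gear change train."""
    return leadscrew_pitch * Fraction(driver_teeth, driven_teeth)

leadscrew = Fraction(1, 10)          # a 1/10-inch-pitch lead screw

print(cut_pitch(leadscrew, 40, 40))  # 1:1 gearing copies the pitch -> 1/10
print(cut_pitch(leadscrew, 40, 80))  # gearing down 40:80 cuts finer -> 1/20
```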
So how do you make a lead screw with exactly, say, 1/10 inch pitch? That’s very difficult starting without precision machinery. I think, but cannot prove, that the development of machine tools drove precision in measurement rather than vice versa. In other words, perhaps the leadscrew of the first screw-cutting engine lathe was made to be some fraction of an inch pitch by approximate methods, but at the time the inch itself wasn’t so precise, and the existence of that leadscrew forced the definition of a more precise inch in terms of that leadscrew.
Yep, you’ve got it. I vaguely recall an episode of James Burke’s “Connections” TV show where he discussed this specifically. I don’t recall the exact details, but the first generation of tools, made by hand with a precision of about 1/10th of an inch, were used to cut screws with a higher precision. Then they used those screws to drive tools that had an even higher precision. I don’t quite understand how that works, but that’s what Burke said, and he usually has his science right.
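My best guess at the mechanism (and this is only a guess at what Burke meant, with invented numbers): if the nut that drives the cutting tool engages many threads of a rough screw at once, the tool moves according to the average pitch of those threads, so random thread-to-thread errors largely cancel, and the screw being cut comes out more uniform than the master driving it. A quick back-of-the-envelope simulation:

```python
# Error-averaging sketch: a long nut spanning many threads of a crude
# screw follows the *average* pitch of the threads it engages, so the
# motion it imparts is smoother than any single thread. Numbers invented.
import random
import statistics

random.seed(1)

NOMINAL = 0.100    # intended pitch, inches
ERROR = 0.002      # random error in each hand-cut thread
N_THREADS = 200    # threads on the crude master screw
NUT_SPAN = 20      # threads the long nut engages at once

# Pitch of each individual thread on the crude master screw
pitches = [NOMINAL + random.gauss(0, ERROR) for _ in range(N_THREADS)]

# Effective pitch felt by the nut at each position along the screw
effective = [statistics.mean(pitches[i:i + NUT_SPAN])
             for i in range(N_THREADS - NUT_SPAN)]

print("per-thread error (std dev): %.5f in" % statistics.stdev(pitches))
print("long-nut error   (std dev): %.5f in" % statistics.stdev(effective))
# The second figure is roughly 1/sqrt(20) of the first, so the new screw
# cut under this averaged motion is more uniform than its parent.
```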
Conversely, you’d be surprised just how much clothing (something you would think would have standard sizing) can vary, even within the same article. Clothing manufacturers punch the pattern pieces out of many layers of fabric at once to get their forms… it’s like trying to cut more than one piece of paper at the same time.