Let’s say the print tolerance for the length of a widget is 6.660 inches to 6.665 inches. A micrometer with a resolution of 0.0001 inches measures the length of a widget. It is 6.6652 inches. Is the widget out-of-spec?
I would say it is within tolerance because the tolerance numbers show 4 significant digits. The micrometer measurement rounded to 4 significant digits is the upper end of the tolerance.
If your tolerance range had been expressed as 6.6600 - 6.6650 then that measurement would be out of tolerance.
This also assumes that your micrometer is actually accurate to 0.0001 inches. Just because that’s the reading doesn’t mean it’s accurate.
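A minimal sketch of that rounding argument in Python (the function and the half-up rounding rule are my own assumptions for illustration, not anything from a standard): round the reading to the resolution the spec was written to, then compare it to the limits.

```python
from decimal import Decimal, ROUND_HALF_UP

def in_spec(reading, lower, upper, resolution):
    """Round the reading to the spec's stated resolution, then compare to the limits."""
    r = Decimal(reading).quantize(Decimal(resolution), rounding=ROUND_HALF_UP)
    return Decimal(lower) <= r <= Decimal(upper)

# Spec written as 6.660 - 6.665 (three decimal places)
print(in_spec("6.6652", "6.660", "6.665", "0.001"))    # True: 6.6652 rounds to 6.665
# Spec written as 6.6600 - 6.6650 (four decimal places)
print(in_spec("6.6652", "6.6600", "6.6650", "0.0001")) # False: no rounding rescues it
```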
I’d say it’s within tolerance. If the desired length is 6.66 and you tolerate +/- .005, you are saying that it goes out of tolerance once it hits 6.666.
I would argue rather that the fact that the tolerance shows only 4 significant digits is an indication that the part isn’t in spec. If whoever wrote the spec intended for 6.6652 to be acceptable, they would have written the spec to greater precision so as to be able to express that. And even a measurement exactly on the edge of the stated interval is already iffy.
You can argue that 6.6652 is outside of the acceptable range, and you can maybe argue that it’s on the edge of the acceptable range, but you can’t argue that it’s within the acceptable range.
If the part has to fit in a 6.665 opening in the finished product, then a 6.6652 part won’t fit, and is out of spec. If there’s more room, your widget is still out of spec, but the customer will be able to use it. The importance of tolerances sometimes comes down to practicality.
If they intended 6.6652 to be unacceptable, then they should have written the spec to 6.6650. They should expect the final digit in an actual measurement to be the result of rounding.
Let’s look at it another way. Suppose your micrometer only read to 0.001 and you measure the same part. It would have shown 6.665 and the part would be in spec.
- Specifying the tolerance independently, like ‘+/-.0001’, is the best way to do it if it actually matters.*
- The last digit of precision is assumed to be rounded.
- You don’t check a spec written to 1/1000 precision with a tool that reads to 1/10000 precision without determining ahead of time how the extra digit will be interpreted.**
*Sometimes specs are merely suggestions just like speed limits.
** Except maybe the first time you encounter the problem.
By that argument, the worse the instruments you use, the more likely a part is to be within spec, and you could get all of your parts to be in spec just by using really terrible instruments.
If the error range of your instrument extends beyond the edge of the acceptable measurement of the widget, then the best you can say is that you don’t know whether the widget is in spec. How much precision you need out of your instrument to be sure the widget is in spec depends on what the measurement is, and how close it is to the boundaries. A micrometer with a resolution of 0.001 inches is just fine, if the part measures in at 6.662 inches. But if the part measures in at 6.660, then we don’t know from that instrument if it’s in spec or not: We’d need to use a finer instrument to see which side of the line it’s on.
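A rough sketch of that idea in Python (it optimistically treats the instrument’s uncertainty as one count of its resolution; real gauge error is usually larger): if the reading sits closer to a limit than the instrument’s uncertainty, that instrument can’t settle which side of the line the part is on.

```python
def verdict(reading, lower, upper, uncertainty):
    """Compare a reading to spec limits, allowing for the instrument's uncertainty."""
    if reading - uncertainty > lower and reading + uncertainty < upper:
        return "in spec"
    if reading + uncertainty < lower or reading - uncertainty > upper:
        return "out of spec"
    return "too close to call with this instrument"

print(verdict(6.662, 6.660, 6.665, 0.001))  # in spec: comfortably inside the limits
print(verdict(6.660, 6.660, 6.665, 0.001))  # too close to call: need a finer gauge
```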
Not really. The instrument in my example is just as good as the spec. Suppose I had a micrometer that measured to 0.0000001 and the measurement was 6.6650002. Is it out of spec?
Is 6.6650002 < 6.665?
Yes, for large values of 6.665 and small values of 6.6650002.
If I were the one whose job it was to determine pass/fail, that part would fail. I’d interpret the spec to be 6.6625 +/- 0.0025, which is why you’re measuring to 0.0001.
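The arithmetic behind that reading of the spec (this is just that interpretation spelled out, not anything printed on the drawing): take the midpoint as the nominal and half the range as the symmetric tolerance.

```python
from decimal import Decimal

lower, upper = Decimal("6.660"), Decimal("6.665")
nominal = (lower + upper) / 2        # 6.6625
tolerance = (upper - lower) / 2      # 0.0025
print(f"{nominal} +/- {tolerance}")  # 6.6625 +/- 0.0025
```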
A lot depends on how the specification was determined. I worked in manufacturing all my life; while not always in QA/QC (Quality Assurance/Quality Control), I was associated with QA/QC functions that whole time. Not surprisingly, this question came up a lot. Usually when third-party inspectors were involved (third-party inspectors are people hired by the buyer who do not know how to do anything, but feel obligated to tell others how to do their job. Kinda like Republican Poll Watchers).
Anyway, if an in-house inspector, for whatever reason, decided to reject a part for measuring 6.6652 when the spec was written as 6.660 - 6.665, the part would almost universally be accepted by the inspector’s supervisor, with the inspector being given more training on how to do their job. As mentioned above, if the intent of the specification was that 6.6652 was unacceptable, then the upper limit would have been expressed as 6.6650 (or to even greater precision). If a third-party inspector was involved, politics would get involved, and who knows how it would play out, but if rejection would significantly affect either price or delivery schedules, the part would likely be accepted.
True story. TL;DR: one time I convinced an international engineering standards board that a measurement tolerance of +/- 0.031" actually met the +/- 0.03" requirement stated in the standard.
At one time, a global standards organization revised some decades-old standards, including metrification of dimensions that had traditionally been expressed in inches to millimeters (while keeping the inch equivalents as part of the standard). In many of the tables in the standard I was dealing with, several dimensions had tolerances of +/- 1/32" (decimal equivalent 0.03125"). Now, in general, a dimension given in (whole and) fractional inches is intended to be measured with a scale (ruler), and a tolerance of +/- 1/32" was a way of saying the dimension should match the tabulated dimension when measured to the nearest 1/32". Machinists typically no longer use a scale to measure dimensions, as digital calipers and micrometers are so ubiquitous, so if the tabulated dimension for a part was 6 5/8" +/- 1/32", the acceptable range was 6.59375 - 6.65625". If the machinist was using digital calipers, that would be rounded to 6.594 - 6.656, and anything outside that range would be rejected. With modern CNC machining equipment, that tolerance would be considered wide enough to back a bus through, so differences between the measured dimension and the rounded tolerance were never an issue on the factory floor.
Now, with the metrification of the standard, the rule was that dimensions, whether in inches or millimeters, had to be decimal only, with no non-decimal fractions. In addition, dimensions expressed to three decimal places in inches were converted to millimeters and rounded to two decimal places; a two-decimal inch dimension was given a one-decimal-place millimeter equivalent. When they got to +/- 1/32", it was decided that +/- 0.8 mm was the best metric equivalent (its inch equivalent is 0.03150", nearly indistinguishable from 0.03125"). Because the millimeter dimension had one decimal place, the inch equivalent was given two places, or 0.03 inches.
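Reconstructing that rounding chain in Python (my own arithmetic, not the standard’s wording): 1/32" converted to millimeters and rounded to one decimal place, then converted back to inches and rounded to two.

```python
from decimal import Decimal, ROUND_HALF_UP

MM_PER_INCH = Decimal("25.4")

inch_tol = Decimal(1) / Decimal(32)                                           # 0.03125"
mm_tol = (inch_tol * MM_PER_INCH).quantize(Decimal("0.1"), ROUND_HALF_UP)     # 0.8 mm
inch_equiv = (mm_tol / MM_PER_INCH).quantize(Decimal("0.01"), ROUND_HALF_UP)  # 0.03"
print(inch_tol, mm_tol, inch_equiv)  # 0.03125 0.8 0.03
```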
Here is where I got involved. We had an auditor come through the manufacturing facility and review our drawings against the new standard. We had literally hundreds of parts drawings that called out the +/- 0.031" tolerance. The auditor flagged this discrepancy in our drawings, citing that they did not conform to the international standard, and advised that our facility have its license to manufacture parts to the standard revoked. Of course, the economic impact of no longer being a licensed manufacturer, as well as the time and cost of revising hundreds upon hundreds of drawings to correct such a minor discrepancy (one that had no effect on the product’s ability to meet the performance expectations of the standard), would have been rather onerous. So I wrote a letter to the international standards board essentially stating that they should consider +/- 0.031" to meet the requirement of +/- 0.03" written into the standard, including the above explanation of how this had developed. They agreed, we kept our certification, and eventually the standard was revised so that +/- 0.031" was called out instead of 0.03".
[Moderating]
This isn’t a Warning, because it was only five words out of a long, informative, and on-topic post. But that interjection of politics in GQ was inappropriate. Don’t do it again.
Sorry.
[Not moderating]
Do I understand correctly that the acceptable tolerance was expressed twice, in different units? And so a precise instrument (which can be set to read out in either unit) could in principle give a measurement smaller than the stated metric limit but larger than the stated American-unit limit? That seems like a poor system; a better one would use only a single system of units (preferably metric), or at least make only one system governing, with the other listed parenthetically only for clarification.
You are correct. At the time, there was a big push for the standard to contain metric units. Because the nominal (standard) sizes covered by the standard did not have any direct metric equivalents, and the design, design verification, and design validation had all been based on foot-inch-pound units, these did not translate cleanly into metric. One option was to totally redefine the standard in metric units, but that was rejected because the cost could not be justified by the benefits. Even the proponents of metrification did not want to deal with that. But, because of exactly the issue you bring up, the standard did state that while both US Customary and metric units were listed in the standard, parts designed and manufactured using US Customary units were to be inspected using US Customary units (or equivalents). If they were designed and manufactured using metric units, then they were to be inspected using metric units (or equivalents). Yes, this could technically end up with product that, if produced to the US Customary units, would conform to the international standard, while the same part produced to the metric units would not. That would be a very small population of product, however, and the degree to which it did not conform to the standard would be very small and considered not enough to affect performance.
Consider a requirement that, for an amusement park ride, the rule was that everyone admitted on the ride must be at least 4 feet tall (1.2192 meters). The same ride, in a country using the metric system might put the limit at 1.2 meters as the minimum height. If the difference is negligible (0.75 inches) to the design of the safety features, both could be considered suitable limits for the ride. Neither limit would allow someone who is only 3 foot 8 inches tall to take the ride. In addition, since the practical effect of such a height restriction on amusement park rides is to keep children less than 6 or 7 years old from riding (since young children may not follow instructions), either height limit works, and certified birth certificates are not required.
IANAmachinist, but if there’s a fit problem, isn’t the tolerance all one way? Like 1.250" +0.0/-.005 for the peg while the hole would be 1.260" +.005/-0.0, which would yield a .020 sloppy fit in a worst-case scenario instead of the intended .010.
I found an online guide that says that with steel parts and a snug fit, a 1.250" diameter tolerates a 1.2500 – 1.2510 hole and a 1.2490–1.2498" shaft, the clearance being 0.0002 – 0.0020". You would have a hard time fitting them together if the shaft were wider than the hole.
Sloppy-fit clearance is 0.0015" - 0.0067" (i.e., a 1.2500-1.2535 hole and a 1.2468-1.2485 shaft); NB, still not negative.
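Those numbers check out if you compute the worst-case clearances from the extreme diameters (a quick sketch; the function and rounding are mine, the values are from the guide quoted above):

```python
def clearance_range(hole, shaft):
    """hole and shaft are (min, max) diameter pairs; returns (min, max) clearance."""
    return round(hole[0] - shaft[1], 4), round(hole[1] - shaft[0], 4)

snug = clearance_range(hole=(1.2500, 1.2510), shaft=(1.2490, 1.2498))
sloppy = clearance_range(hole=(1.2500, 1.2535), shaft=(1.2468, 1.2485))
print(snug)    # (0.0002, 0.002)   -- the quoted 0.0002 - 0.0020"
print(sloppy)  # (0.0015, 0.0067)  -- the quoted 0.0015 - 0.0067"
```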
In the OP, the tolerance was given as “6.660 inches to 6.665 inches.” The tolerance goes both ways. Depending on the fasteners used, a part smaller than 6.660 inches might not stay put. A little too big, or a little too small, and the customer might be grumpy, but he might be able to use it rather than waiting for a replacement that’s the right size. In real-life manufacturing, sometimes you’ll hear, “We’ll buy it today, but don’t let it happen again.”