santicor opened this issue on Sep 03, 2010 · 70 posts
PrecisionXXX posted Sun, 29 March 2015 at 7:45 PM
"For example, you are producing ball point pens with a ball outer diameter of 0.35 mm. Acceptable balls range from 0.34 to 0.36 mm. Subtracting the maximum and minimum diameters, the process tolerance is 0.02. In order to accurately measure these balls, you must use a gage capable of detecting several differences in this range. Using the Rule of Tens, calculate 10% of the process tolerance. This means your gage must be able to detect a difference of at least 0.002 mm between balls."
So in other words you used to be really good at measuring balls. Lol... sorry couldn't resist. :D
(Runs out the door)
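The Rule of Tens arithmetic quoted above is easy to check. Here is a minimal sketch (Python chosen only for illustration; the function name and the 10% factor argument are mine, not from the quote):

```python
# Rule of Tens from the quoted example: the gage must resolve 10% of the
# process tolerance. The limits (0.34-0.36 mm) come from the quote;
# everything else here is illustrative.

def required_gage_discrimination(lower_limit_mm, upper_limit_mm, rule_of_tens_factor=0.10):
    """Smallest difference the gage must be able to detect."""
    process_tolerance = upper_limit_mm - lower_limit_mm   # 0.36 - 0.34 = 0.02 mm
    return process_tolerance * rule_of_tens_factor        # 10% of 0.02 = 0.002 mm

print(f"{required_gage_discrimination(0.34, 0.36):.3f} mm")  # 0.002 mm, i.e. 2 microns
```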
.002 mm, or 2 microns, is a part tolerance. The gage tolerances we had were far less than that; our standards were considerably tighter. Two microns is two millionths of a meter, which may be fine for some products, but we kept our gaging, at least the calibration standards, to usually around 20 millionths of an inch. That accounts for the instrument we used having a known error (not a problem, it's known) and an uncertainty that was sometimes greater than the gage tolerance, in which case we used monochromatic light compared against standards held to grade 0.5, or half a millionth of an inch.

With part tolerances getting tighter and tighter, the fun of the job disappeared fast, and I wasn't unhappy to retire at 55 with 30 years of service. Some of the guys I worked with are over seventy now, still working, with 50 or more years. They'll have nice pensions, if they don't die the day they retire. Seen that too.

Almost forgot: in the case of a ball that goes into something, a proper tolerance would be noted as .36 mm to .34 mm, not as a plus or minus .01. Gaging would follow: the error on the larger size would have to be toward the smaller, and on the smaller size toward the bigger. That keeps the known error within the part tolerance.
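To make that last point concrete, here is a minimal sketch of biasing the known gage error into the part tolerance; the 0.002 mm error value and the function name are assumptions for illustration, not from the post:

```python
# Bias the known gage error *inside* the part limits: the check on the upper
# limit is set toward the smaller size, the check on the lower limit toward
# the bigger size, so a part that passes can't be outside .34-.36 mm.

def biased_gage_settings(lower_limit_mm, upper_limit_mm, known_gage_error_mm):
    """Return (lower setting, upper setting) with the error kept inside the part tolerance."""
    upper_setting = upper_limit_mm - known_gage_error_mm   # error toward the smaller
    lower_setting = lower_limit_mm + known_gage_error_mm   # error toward the bigger
    return lower_setting, upper_setting

lo, hi = biased_gage_settings(0.34, 0.36, 0.002)  # assumed 0.002 mm known gage error
print(f"{lo:.3f} mm to {hi:.3f} mm")  # 0.342 mm to 0.358 mm -- inside the part limits
```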
Doric
The "I" in Doric is Silent.