Please clarify the Rule of 10 to 1 - AND - What is the ndc number?


Allattar

I would have to think a lot on that one, and I'm afraid I may not be clever enough to come up with such a test :).

We do have a number, the ndc; we have a goal of > 10; and we have a number of samples and operators, hence values for degrees of freedom. Some kind of one-sided test comes to mind.

I am aware the last comment was not aimed at me directly though :)
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
Allattar said: "I am aware the last comment was not aimed at me directly though :)"

True. Those who use Wheeler's approach to gage evaluation also need to consider whether they are achieving 10 statistically distinct categories within the control limits. So, yes... any way you get there, you need to get there! :tg:
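For what it's worth, those "distinct categories" are what the ndc in the thread title refers to: the AIAG MSA manual's number of distinct categories. A minimal sketch of that calculation, assuming the part variation (PV) and gage R&R (GRR) standard deviations come from a completed gage study (the numbers below are hypothetical):

```python
import math

# ndc = 1.41 * (PV / GRR), truncated to an integer, per the AIAG MSA
# manual; part_sd and grr_sd are the part-variation and gage R&R
# standard deviations from a gage study.
def ndc(part_sd: float, grr_sd: float) -> int:
    return math.trunc(1.41 * part_sd / grr_sd)

# Hypothetical study results: PV = 0.8, GRR = 0.1
print(ndc(0.8, 0.1))  # 11 -> clears the AIAG floor of 5 and the 10-category goal
```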
 

jee2006

The "10:1 rule" is a guideline in metrology. Broadly speaking, your measuring instrument chosen should be accurate (not just discriminate) to 1/10th of the tolerance.

In other words, if you have a feature with a tolerance of 0.010", your measuring instrument should be accurate to 0.001" or better.
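As a quick sketch of that arithmetic (nothing more than the division described above):

```python
# 10:1 rule as stated above: required instrument accuracy is
# 1/10 of the feature tolerance.
def required_accuracy(tolerance: float, ratio: float = 10.0) -> float:
    return tolerance / ratio

print(required_accuracy(0.010))  # 0.001" for the 0.010" tolerance example
```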


Unfortunately, I have no idea what 'ndc' is out of context. What book are you using? Some of us here may have the same one, or one by the same author, and could work out the meaning from that.


:applause:

I liked the answer you gave to the question, but to make it clearer, would you explain it with a numeric example? Say the measurement equipment has a least count of 10 and the specification is ±50. In this case, what would the P/T calculation be?
Thanks, Regards
jee2006
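One hedged way to read that example, assuming the least count is the only error source considered, the ±50 spec gives a total tolerance of 100, and P/T = 6σ/tolerance per the AIAG convention (5.15 is the older factor):

```python
import math

least_count = 10.0
tolerance = 100.0  # +/-50 spec -> total band of 100

# Discrimination check: resolution should be <= 1/10 of the tolerance.
print(least_count / tolerance)  # 0.10 -> exactly at the 10:1 limit

# Resolution-only P/T, treating quantization error as uniform over one
# least count, so sigma = least_count / sqrt(12).
sigma = least_count / math.sqrt(12)
print(6 * sigma / tolerance)  # ~0.17, i.e. ~17% before any R&R error is added
```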
 

DonDowns

I ran into this debate in my shop, not only between manufacturing and inspection but also with customer source inspectors. I resolved it by citing ASTM E 29, which ended the debate. I hope this helps.
ASTM E 29 Standard Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications.
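A minimal sketch of how the E 29 "rounding method" plays out in a conformance decision, as I understand it: round the observation to the same decimal places as the specification limits, using round-half-to-even, then compare. The values below are hypothetical.

```python
from decimal import Decimal, ROUND_HALF_EVEN

def conforms(observed: str, low: str, high: str) -> bool:
    lo, hi = Decimal(low), Decimal(high)
    # Round the observation to the limits' decimal places (half-to-even).
    rounded = Decimal(observed).quantize(lo, rounding=ROUND_HALF_EVEN)
    return lo <= rounded <= hi

print(conforms("0.01049", "0.000", "0.010"))  # True: rounds to 0.010
print(conforms("0.01051", "0.000", "0.010"))  # False: rounds to 0.011
```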
 

arunk372

Thanks Ron, thanks Tim, and above all thanks geees, who asked this question.
 

Hershal

Metrologist-Auditor
Trusted Information Resource
Actually, the 10:1 rule has changed since its debut, just like the 4:1 rule. Now one needs to know whether the discussion is about TAR or TUR, because they are different.

The original, going back to the old MIL-STD-45662 and 45662A, uses a TAR, or test accuracy ratio: the standard must be ten times more accurate than the item it is used to calibrate.

However, the TUR, or test uncertainty ratio, instead compares the uncertainty of the standard to the tolerance of the item being calibrated. So, for 10:1, the standard's uncertainty may be no more than 1/10 of the tolerance for that measurement of the item being calibrated.
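A minimal sketch contrasting the two ratios as described above, with hypothetical numbers (both are usually quoted as "N:1"):

```python
def tar(uut_tolerance: float, standard_tolerance: float) -> float:
    """Test Accuracy Ratio: UUT tolerance vs. the standard's accuracy."""
    return uut_tolerance / standard_tolerance

def tur(uut_tolerance: float, standard_uncertainty: float) -> float:
    """Test Uncertainty Ratio: UUT tolerance vs. the standard's uncertainty."""
    return uut_tolerance / standard_uncertainty

# Hypothetical point: UUT tolerance +/-1.0, standard accuracy +/-0.1,
# but the standard's expanded uncertainty at this point is 0.15.
print(tar(1.0, 0.10))  # 10.0 -> meets 10:1 on a TAR basis
print(tur(1.0, 0.15))  # ~6.7 -> misses 10:1 on a TUR basis at this point
```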

That can be a challenge at times, as a standard may be able to deliver the 10:1 or 4:1 TUR on most points, but perhaps not all.
 