Hey all. We just received a new requirement from one of our customers for MSA studies where it says we have to do the following:
“To help assess the gauge, the organization shall report the value of +/- 2 Total Gauge R&R Standard Deviations to understand the 95% prediction interval (uncertainty) of any one measurement. This value can be used in conjunction with engineering judgment to help assess the distance between the edge(s) of the process distribution and the specification limit(s). The organization shall report gauge R&R as both a percent of study variation and a percent of tolerance.”
To start, we used to just run a 3x3x10 ANOVA study (3 operators, 3 trials, 10 parts) for this customer and aim to land under 10%. The change is that they added the “+/- 2 Total Gauge R&R Standard Deviations” requirement, and I am not 100% sure what that entails.
Also, for characteristics like runout with a one-sided tolerance, it says to calculate percent of tolerance as 6 total Gauge R&R standard deviations divided by (USL minus a lower boundary of zero). Does anyone have any insight on this?
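For what it's worth, here is how I'm currently reading the math, so folks can tell me if I'm off base. A minimal sketch; the sigma values and the runout spec below are made up for illustration, and sigma_grr would come from the Total Gauge R&R row of the ANOVA output:

```python
# Hypothetical numbers purely for illustration.
sigma_grr = 0.004    # Total Gauge R&R standard deviation (from the ANOVA table)
sigma_study = 0.020  # Total Variation (study) standard deviation
usl = 0.10           # one-sided runout spec; lower boundary is zero

# "+/- 2 Total Gauge R&R Standard Deviations": a ~95% interval around any
# single reading, attributable to measurement error alone.
half_width = 2 * sigma_grr
print(f"Any one measurement: +/- {half_width:.4f} (~95% uncertainty)")

# Percent of study variation (the usual %GRR figure).
pct_study = 100 * sigma_grr / sigma_study
print(f"%GRR (study variation): {pct_study:.1f}%")

# Percent of tolerance for the one-sided case, per the customer's wording:
# 6 total R&R standard deviations divided by (USL - 0).
pct_tol = 100 * (6 * sigma_grr) / (usl - 0.0)
print(f"%GRR (tolerance, one-sided): {pct_tol:.1f}%")
```

With these made-up numbers that gives +/- 0.008 for a single measurement, 20% of study variation, and 24% of tolerance.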