ndc is totally unrelated to tolerance. It is based 100% on the process variation, which is what it is, and should never be artificially inflated as Bob suggests. You should make certain that the batches represent your full process variation, no more, no less.
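For anyone who wants to see that relationship spelled out: per the AIAG MSA manual, ndc uses only the part (process) variation and the total gage R&R variation, both as standard deviations from the study; tolerance appears nowhere in it. A minimal sketch, with the function name and numbers made up purely for illustration:

```python
import math

def ndc(part_variation: float, grr: float) -> int:
    """Number of distinct categories: 1.41 * (PV / GRR), truncated to an integer."""
    return math.floor(1.41 * part_variation / grr)

# Made-up example: PV = 0.045, GRR = 0.012 (same units)
print(ndc(0.045, 0.012))  # -> 5, the commonly cited minimum acceptable value
```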
That is an ideal situation, but it can be economically or physically infeasible depending on the process (for example, getting the many lots of raw material you may see over the life of the process in order to generate that variation). The process variation that can be presented to a gage R&R study - in the time frame in which the gage R&R needs to be prepared - can rarely represent the entire process variation over the life of the process. It can only represent the samples you have at the time, which should be considered a snapshot in time. You can hope there is enough variation in those samples that the gage-and-operator measuring system can detect the differences between them. That is key.
OK, the true calculation of ndc does not include the tolerance. But let's look at what the calculation is trying to say - after all, the relationship it is trying to describe is still the most important thing to understand. I'd never say never; there may be something to be learned. For example, an ndc of 1 when calculated against the tolerance is a guaranteed no-win situation - no need to go any further. But perhaps a better surrogate for the PV would be (UCL - LCL). As a matter of fact, when using (UCL - LCL) instead of PV, the resulting value should equal or exceed 10 for SPC. Not comfortable calling it ndc because it does not precisely match the ndc calculation? OK, fine... we can give it another name, ndc_cl or something. The PV in a gage R&R is pretty weak - it surely does not rise to the level of confidence of a capability study, etc. It's a small nail - don't hang too big a hat on it.
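To make the (UCL - LCL) idea concrete, here is a rough sketch of what such an ndc_cl might look like. This is only an illustration of the suggestion in this post, not an established formula: it assumes you already have control limits from a control chart and the GRR standard deviation from the study, and it keeps the 1.41 factor for symmetry with ndc even though the post does not say whether it should be kept.

```python
import math

def ncd_placeholder():  # (unused) reminder that names here are hypothetical
    pass

def ndc_cl(ucl: float, lcl: float, grr: float) -> int:
    """Hypothetical 'ndc_cl': substitute the control-limit spread (UCL - LCL)
    for PV in the ndc calculation. Not an AIAG formula - just the idea above."""
    return math.floor(1.41 * (ucl - lcl) / grr)

# Made-up example: control limits 9.70 / 10.30, GRR = 0.08 (same units)
print(ndc_cl(ucl=10.30, lcl=9.70, grr=0.08))  # -> 10, the threshold mentioned above
```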