Calibration Frequency for Rarely Used Gages

deniser

Cindy,

He's not a new engineer; he's been around a while. It is my opinion that our system will self-regulate and find the right maximum interval for each piece of equipment. He is concerned that the interval might stretch beyond 5 years. In some cases it might, but that equipment would also come with a 20-year history of good calibrations with no adjustment needed. If a unit fails, we halve the interval and hold at that level for at least 3 good cycles before starting to increase it again. I believe we'll know when we reach the max for that unit.
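In rough Python terms, the halve-and-hold rule looks like this. The function name and the growth step are illustrative assumptions; the post above says only that the interval starts to increase again after 3 good cycles, not how fast:

```python
def next_interval(months, passed, good_streak, grow=1.25, hold_cycles=3):
    """Halve-on-failure interval rule described above.

    grow (a 25% stretch per good cycle) is an illustrative assumption.
    Returns (new_interval_in_months, updated_good_streak).
    """
    if not passed:
        return months / 2, 0           # OOT: halve and restart the streak
    good_streak += 1
    if good_streak < hold_cycles:
        return months, good_streak     # hold at the reduced interval
    return months * grow, good_streak  # streak established: stretch again
```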
 
Ryan Wilde

If I were your customer, I would actually worry as well, unless you answered a few questions. Long intervals work, especially on extremely stable gauges that receive no abuse. The problem is that with a 5+ year interval, the abuse could happen, and the gauge could then be in error for years.

My questions would be:

  • Are the gauges checked in-process? (i.e., a quick check every month to give a level of confidence)

  • Are there records that show that your interval computation is statistically sound?

  • Is the new extended interval set to recalibrate just BEFORE the gauge goes out of specification, or just AFTER the gauge goes out of specification?

Sorry, I just needed to play a bit of devil's advocate on a Friday before I go home. It's been a long week of employee performance reviews, and I needed a technical break to ease my mind.

Ryan
 
deniser

Ryan,

No problem playing devil's advocate. I'm trying to find out if we're really missing something by not setting an upper limit.

The gauges that adjust their intervals are not mechanical or dimensional; damage to them would be apparent. They're items like pulse generators, power supplies, and other electrical-output units, and the output is often read by another unit. They're all units we've had for 15-20 years. A lot of the users were on the teams that developed these units, and they know immediately when a unit has a problem. All are in engineering, not in production.

Our intervals are not exactly in compliance with NCSL's methods, since we do not have an in-house calibration facility and must rely on subcontractors who come onsite twice yearly, though we've tried to stay as true to the method as we can.

There is a practical limit to the cal interval. People tend to get nervous at the 4-5 year cycle, decide the risk isn't worth the potential cost savings, and take the units off auto-adjust. Our longest interval is currently 5 years, and those units have been in the closet for 2 of those 5 years.

We just had our onsite. They did 196 units; 2 failed. One was ESD-zapped by the calibrator; the other was broken and needed a new jumper. Nothing needed adjusting. I can't recall a time when we've had more than 1% OOTs.

I suppose we could do something extremely silly like put an upper limit of 20 years, but I'd rather not set a limit unless it has meaning.
 
Cindy

deniser,

Can't say as I blame you. We have a lot of calibration certificates for equipment that has never been out of calibration, plus we perform pre/post-test verification through the whole chain for any test, so we are certain that all equipment used is functioning properly and we have the pre/post verifications. We also try to reduce our required calibrations because of the cost of some of them. But honestly, anything over 5 years is cause for concern.

Can you prove that the equipment is verified, and can you provide the evidence?

And on another note: if the customer is requesting this to ensure correctness and the customer is knowledgeable, I probably would do the calibration just to give the customer that warm fuzzy feeling and keep them happy. Maybe in time, with supporting data, the customer will come to see your point.

Cindy
 

Hershal

Wow! This is a great thread!

For the general case of folks with lots of gages that are primarily micrometers, calipers, dial gages and so forth: look at your actual use and environment. If the use is daily, in a machine shop or test lab environment, then calibration more frequent than annual might be appropriate. If the gage is in a well-controlled environment, rarely used, and stored carefully, then a longer cycle is certainly appropriate (with a few exceptions).

Annual is the typical starting point. Adjust from there based on calibration history, preferably using NCSLI RP-1 as a guide. Your accredited third-party calibration provider should be able to advise you, based on the history of the item and the use and environment it is in. That, of course, depends somewhat on the calibration provider having the history of the item.

Also, remember that in some cases changing a battery or fuse is considered a repair and therefore draws a repair charge from the calibration provider. This is the case when batteries and/or fuses are built into the unit such that the same opening must be made to adjust the unit or to change a battery/fuse. Generally this applies to electronic items, not calipers. Ask your calibration provider about that before having them change the battery/fuse, and ask how much the repair charge is. I have seen rates as high as $120 per hour for repair, and changing a battery can take as much as 15-20 minutes depending on the unit. For fuses, a solder-in fuse is actually a repair, but a snap-in fuse is a consumable.

Hope I haven't scared too many folks...

Hershal
 
Graeme

Where's your comfort zone?

deniser said:
We have a customer that says we need to set a top end of our interval adjustments. We're using a modified NCSL RP-1 adjustment frequency, with the multipliers adjusted to match our onsite calibration frequency. Has anyone else ever been required to set an upper limit on a quasi-statistically based frequency?
The cal lab I am working with is the in-house electronic calibration facility for a major airline. We have decided on these rules and written them into the appropriate quality policy:
  • Calibration interval analyses are based on methods in NCSL RP-1 (usually method A3) at 95% end-of-period reliability and 95% confidence level. Interval adjustments are done on a model-number basis where possible -- all Fluke 87 meters, for example.
  • If the data supports an increase in the interval, we limit the maximum increase to 50% of the current interval. (For example, a meter currently calibrated every 12 months would be extended to no more than 18, even if the data supports more.)
  • The maximum calibration interval allowed is 60 months (5 years).
The limits on amount of increase and on maximum interval are deliberately conservative because this industry is intensely safety-conscious. Making a major change like doubling or tripling the current interval is much too far outside our comfort zone.
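Here's a rough sketch of how the two limits interact, assuming the RP-1 analysis itself is done elsewhere (the function name is mine; the 50% and 60-month figures are from the policy above):

```python
MAX_MONTHS = 60  # policy ceiling from the list above: 5 years

def capped_new_interval(current, recommended):
    """Limit any increase to 50% of the current interval and to the
    60-month policy maximum; decreases pass through unchanged."""
    return min(recommended, 1.5 * current, MAX_MONTHS)
```

With the Fluke 87 example, capped_new_interval(12, 24) returns 18 months, and no recommendation can ever push a unit past 60.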

A few general observations:
It takes much less data to reduce an interval than to increase one. For example, an item had 2 repeated OOT events at a 3-month interval, so it was reduced to 1 month. There is very little likelihood that it would pass the next 40 consecutive calibrations, which is what would be required to maintain the 95% reliability. (That model is also going to be replaced.)
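Reconstructing the arithmetic behind that 40-calibration figure (one plausible reading, using simple observed-reliability arithmetic; the calculation isn't shown above):

```python
# With 2 OOT events on record, observed end-of-period reliability gets
# back to 95% only when those 2 failures are <= 5% of all events:
failures = 2
print(failures / 0.05)  # -> 40.0 calibrations in all, with no new failures

# Even at a true per-cycle reliability of 95%, a clean 40-pass run is unlikely:
print(0.95 ** 40)       # -> ~0.13
```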

Recognize that where there are fewer than 10 units of a model number in the company, it is unlikely for them to ever have their interval increased by using any of the easily-implemented methods in RP-1. This is because the time required to gather data for a statistically valid analysis exceeds the probable useful lifetime of the equipment.

In about 20 years in several different industries I have never seen an interval longer than 72 months -- and that was for totally passive items like waveguide directional couplers.
 
Graeme

Rob Nix said:
Yes, a recommendation is exactly that - a recommendation. It is not a mandate.
Except ... if you are in a regulated industry (medical, nuclear, aircraft maintenance, etc.) then the regulatory agency probably has a few things to say about it.

The FAA, for example, requires recording actual data (not just pass/fail) for every calibration, and statistical analysis of that data history to determine the interval. If you are referring to NCSL RP-1 for methods, the statistical requirement means that you cannot use methods A1 or A2 -- which happen to be the two most widely used methods.
 
Graeme

CarolX said:
Wes,
You are correct... and for as long as I have been in the quality field, the term "calibrate" has actually meant "to check". Calibration is the adjustment you make for an out-of-tolerance condition. But to make things as easy as possible in the worlds we work in, I think we all refer to the function as "calibration".
Right or wrong... IMHO it is just semantics. As long as the message is understood.

CarolX
Short definition: calibration is verification of the instrument's performance with respect to its performance specifications, by using external calibrated and traceable measurement standards of known value and uncertainty.

Long form of the definition: First, all of the above. Then, if indicated by the results of the initial calibration run, adjustment or repair to restore proper operation. Finally, another calibration run to verify that the performance meets specifications.

The basic premise of a proper calibration procedure is that the instrument is in good working order and only needs to have its performance verified (NCSL RP-4). The calibration is performed with the instrument in its normal operating configuration.

The problem with the "calibration" procedures in most manufacturers' manuals is that they start by assuming that the instrument always needs adjustment. In general, adjustment is only required if the calibration (performance verification) either fails, or a result falls within a specification limit +/- the uncertainty of the measurement standard. Adjustment may be performed in other cases where the result is within specification, but with cautious awareness of the risks of tampering with a process.
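In code terms, that adjustment rule looks roughly like this, assuming a symmetric tolerance band (the names here are illustrative, not from any standard):

```python
def needs_adjustment(reading, nominal, tol, u_standard):
    """Adjust only on an outright failure, or when the result lands
    within +/- u_standard (the measurement standard's uncertainty)
    of a specification limit."""
    error = abs(reading - nominal)
    if error > tol:
        return True    # failed the performance verification: adjust/repair
    if error > tol - u_standard:
        return True    # inside the guard band around the limit
    return False       # comfortably in spec: leave it alone
```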
 
blemon

I was scanning the board for an answer to a similar question...we have a Fowler Optical Comparator Set which might be used once a year, maybe twice. I was told by an engineer at the OEM that the shift over time in accuracy is negligible. Basically, what I am looking for is substantiation for extending the interval out as far as feasibly possible. If I can justify it on paper, management, auditors, and integrity will be satisfied.

Any help will be greatly appreciated!
 

Al Rosen

blemon said:
I was scanning the board for an answer to a similar question...we have a Fowler Optical Comparator Set which might be used once a year, maybe twice. I was told by an engineer at the OEM that the shift over time in accuracy is negligible. Basically, what I am looking for is substantiation for extending the interval out as far as feasibly possible. If I can justify it on paper, management, auditors, and integrity will be satisfied.

Any help will be greatly appreciated!

Use the history of the comparator to justify the calibration interval. If the device is always within tolerance when received for calibration, extend the interval. You can also keep track of the number of times it is used and set an interval based on that, as in the sketch below. You might also want the OEM to put the accuracy shift over time in writing.
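A rough sketch of that usage-count approach; the limits here (10 uses, 5 years) are placeholders, not recommendations:

```python
from datetime import date

class TrackedGage:
    """Recalibrate after max_uses uses or max_days days,
    whichever comes first. Limits are illustrative only."""
    def __init__(self, last_cal: date, max_uses: int = 10, max_days: int = 1825):
        self.last_cal, self.uses = last_cal, 0
        self.max_uses, self.max_days = max_uses, max_days

    def record_use(self):
        self.uses += 1

    def calibration_due(self, today: date) -> bool:
        return (self.uses >= self.max_uses
                or (today - self.last_cal).days >= self.max_days)
```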
 