A New Paradigm for Calibration Intervals

Posted by Grady Keeton on Apr 2, 2019 3:42:06 PM

Calibration is not optional. While today's instruments are more accurate and drift less than previous generations, you still need to check and calibrate your equipment periodically. The way you do this, however, is changing.

In the past, manufacturers would recommend calibration intervals. A digital multimeter manufacturer might, for example, recommend that you calibrate the instrument once a year. When that year was up, you sent the DMM to your company's cal lab or to a third party.

That paradigm is changing. The current trend in metrology is not to blindly follow a manufacturer’s stated calibration interval, but to determine your own interval based on how much each instrument drifts over time and how much risk you're willing to take. 

We have some customers who calibrate our instruments at the beginning of each shift. These customers produce a high-value product and don't want to risk shipping a faulty unit or system because their test equipment is out of spec. Other customers calibrate only once per year. These customers have done their homework and determined that the risk of using out-of-calibration test equipment is low compared to the cost of calibration, and they usually have the characterization data to support this approach. Most customers fall somewhere between these extremes.
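
A back-of-the-envelope comparison can make that trade-off concrete. The sketch below (Python, with entirely made-up numbers for calibration cost, out-of-tolerance probability, and the cost of shipping a bad unit) simply adds calibration spend to the expected cost of an escape for two candidate intervals. It illustrates the reasoning only; it is not a model any customer actually uses.

```python
# Rough cost/risk comparison for choosing a calibration interval.
# All numbers are hypothetical placeholders, not data from any customer.

def annual_cost(cal_cost_per_event, cals_per_year,
                p_out_of_tol_per_interval, cost_of_escape):
    """Calibration spend plus the expected cost of having used an
    out-of-tolerance instrument, over one year, under very simplified
    assumptions (at most one escape per calibration interval)."""
    calibration_spend = cal_cost_per_event * cals_per_year
    expected_escape_cost = (p_out_of_tol_per_interval * cals_per_year
                            * cost_of_escape)
    return calibration_spend + expected_escape_cost

# Hypothetical figures: $300 per calibration, $50,000 if a faulty unit ships.
yearly = annual_cost(300, cals_per_year=1,
                     p_out_of_tol_per_interval=0.05, cost_of_escape=50_000)
quarterly = annual_cost(300, cals_per_year=4,
                        p_out_of_tol_per_interval=0.01, cost_of_escape=50_000)

print(f"Calibrate yearly:    ~${yearly:,.0f} expected per year")
print(f"Calibrate quarterly: ~${quarterly:,.0f} expected per year")
```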

One thing to keep in mind is that the calibration interval can change over time. Some instruments drift more than others as they age, and you will have to calibrate those instruments more often. Also note that two units of the same model, manufactured at the same time and used under the same conditions, can and do exhibit different rates of drift. When that happens, the appropriate calibration interval differs between the two units.
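
One simple way to let the interval track an individual unit is a reactive rule: stretch the interval a little each time the unit is found in tolerance at calibration, and cut it back sharply when it is found out of tolerance. The sketch below is a minimal illustration of that idea; the growth and shrink factors and the 30-to-365-day limits are assumptions for the example, not recommended values.

```python
def adjust_interval(current_days, found_in_tolerance,
                    grow=1.2, shrink=0.5,
                    min_days=30, max_days=365):
    """Reactive calibration-interval adjustment: stretch the interval a
    little after an in-tolerance result, cut it sharply after an
    out-of-tolerance result.  Factors and limits are illustrative only."""
    factor = grow if found_in_tolerance else shrink
    new_days = current_days * factor
    return max(min_days, min(max_days, round(new_days)))

# Example history for one unit: True = found in tolerance at calibration.
interval = 180
for result in [True, True, False, True]:
    interval = adjust_interval(interval, result)
    print(f"found {'in' if result else 'OUT of'} tolerance -> "
          f"next interval {interval} days")
```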

You may also use an instrument in different applications over its lifetime, and each application can affect how often the instrument needs calibration. For example, a power supply moved to an environment that's hotter than its previous application may see its calibration shift more than it did before. By closely monitoring the instrument's performance and taking into account how much risk you're willing to accept, you'll be able to set the proper calibration interval.
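
If you record the error found at each calibration or interim check, you can estimate a unit's drift rate in its current application and work backward to an interval that keeps predicted drift inside a guard-banded portion of the spec. The sketch below fits a straight-line drift to hypothetical check data; the readings, the 10 ppm tolerance, and the 70% guard factor are all assumptions, and a real program would use a proper interval-analysis method such as those described in NCSL RP-1.

```python
# Estimate a linear drift rate from periodic check data and derive an
# interval that keeps predicted drift inside a guard-banded tolerance.
# All data below are hypothetical.

days   = [0, 90, 180, 270, 360]       # days since last adjustment
errors = [0.0, 0.8, 1.7, 2.4, 3.3]    # observed error at each check, in ppm

# Least-squares slope through the origin (drift per day, in ppm).
drift_per_day = (sum(d * e for d, e in zip(days, errors))
                 / sum(d * d for d in days))

tolerance_ppm = 10.0   # instrument spec (assumed)
guard_factor  = 0.7    # allow drift to consume only 70% of the spec

max_interval_days = guard_factor * tolerance_ppm / drift_per_day
print(f"Estimated drift: {drift_per_day:.4f} ppm/day")
print(f"Suggested maximum interval: {max_interval_days:.0f} days")
```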

Several industry and standards organizations support this approach. ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories, for example, states that it is the responsibility of the end-user organization to determine the appropriate calibration interval under the requirements of its own quality system.

The National Institute of Standards and Technology (NIST) has a web page that describes its recommendations on how to establish calibration intervals. It says that specific recalibration intervals depend on a number of factors, including:

  • Accuracy requirements set by customers
  • Requirements set by contract or regulation
  • Inherent stability of the specific instrument or device
  • Environmental factors that may affect the stability

Finally, NCSL International, a professional association for metrologists, publishes RP-1, Establishment and Adjustment of Calibration Intervals, which “provides a guide for the establishment and adjustment of calibration intervals for equipment subject to periodic calibration. It provides information needed to design, implement and manage calibration interval determination, adjustment and evaluation programs.” The NCSL website has a number of other publications on this topic to help you set appropriate calibration intervals.

Topics: calibration, metrology, Calibration intervals
