
How often should instruments be calibrated?

19 January 2015

Plant efficiency can be improved and costs reduced by performing calibration history trend analysis. It can help to define which instruments can be calibrated less frequently and which should be calibrated more frequently. Calibration history trend analysis is only possible with calibration software that provides this functionality, says Beamex. 

Manufacturing plants require instruments that perform and measure to specified tolerances. If sensors drift out of their specification range, the consequences can be disastrous, resulting in costly production downtime, safety issues or the production of inferior product batches which then have to be scrapped.

Most process plants have a maintenance plan or schedule in place to ensure that all instruments are calibrated at the appropriate times. However, with increasing demands and cost pressures on manufacturers, the time and resources required to carry out these calibration checks are often scarce. Instruments are therefore prioritised for calibration: those deemed critical receive the required regular checks, while others – considered less critical to production – are calibrated less frequently or not at all.

Plants can improve their efficiency and reduce costs by using calibration ‘history trend analysis’ – a function available within Beamex CMX calibration software. With this function, a plant can analyse whether the calibration frequency of each of its instruments should be increased or decreased.

Cost savings can be achieved in two ways: by calibrating less frequently where an instrument’s calibration history shows it to be highly stable, and by calibrating more often where instruments are located in critical areas of the plant, ensuring they are checked and corrected before they drift out of tolerance. This is common practice for companies that employ an effective ‘preventive maintenance’ regime. Analysing the historical trend – how a pressure sensor, for example, drifts in and out of tolerance over a given time period – is only possible with calibration software that provides this type of functionality.

Current practices
How often do process plants, in reality, calibrate their instruments and how does a maintenance manager or engineer know how often to calibrate a particular sensor? 

Beamex conducted a survey that asked process manufacturing companies how many instruments in their plant required calibrating and how frequently these instruments had to be calibrated. It covered all industry sectors, including pharmaceuticals, chemicals, power and energy, manufacturing, service, food and beverage, oil and gas, paper and pulp. Interestingly, across all industry sectors, 56% of respondents said they calibrated their instruments no more than once a year. In the pharmaceutical sector, by contrast, 59% calibrated once a year and a further 30% calibrated twice a year.

The survey also showed that pharmaceutical plants typically have a significantly higher number of instruments requiring calibration, and that they calibrate those instruments more frequently than plants in other industry sectors.

Analysing calibration history trends
Regardless of the industry sector, analysing an instrument’s drift over time (the historical trend) can reduce costs and improve efficiencies.

Pertti Mäki, area sales manager at Beamex, explains: “The largest savings from using the History Trend Option are found in the pharmaceuticals sector, but all industry sectors can benefit from using the software tool, which helps identify the optimal calibration intervals for instruments.” 

The trick is to determine which sensors should be re-calibrated after a few days, weeks, or even years of operation and which can be left for longer periods, without sacrificing the quality of the product or process or the safety of the plant and employees. Doing this enables maintenance staff to concentrate their efforts only where they are needed.

There are other, less obvious benefits of looking at the historical drift of a sensor or set of instruments over time. Mäki explains: “When an engineer buys a particular sensor, the supplier provides a technical specification that includes details about its maximum drift over a given time period. With CMX’s History Trend Option, the engineer can verify that the sensor actually performed within the specified tolerance over a certain time period. If it hasn’t, the engineer now has data to present to the supplier to support these findings.”

The History Trend function also allows the quality or performance of different sensors from multiple manufacturers to be compared in a given location or set of process conditions. 

CMX Calibration software can also help with the planning of calibration operations. Calibration schedules take into account the accuracy required for a particular sensor and the length of time during which it has previously been able to maintain that degree of accuracy. Sensors that are found to be highly stable do not need to be re-calibrated as often as sensors that tend to drift.

The History Trend function enables users to plan the optimal calibration intervals for their instruments. Once implemented, maintenance personnel, for example, can analyse an instrument’s drift over a set time period. History Trend can display drift both numerically and graphically. Based on this information, it is then possible to make decisions regarding the optimal calibration interval and the quality of the instruments with respect to measurement performance.
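The kind of drift analysis described above can be sketched in a few lines. The example below is purely illustrative – it is not Beamex CMX code, and the record fields and function names are assumptions – but it shows the underlying idea: comparing the error found at each calibration (‘as-found’) with the error left after the previous adjustment (‘as-left’) reveals how much the instrument drifted over each interval, and whether that drift is growing.

```python
# Illustrative sketch only (not Beamex CMX code): estimating per-interval
# drift from a calibration history. Field names and values are assumptions.

def interval_drifts(history):
    """Drift accumulated between successive calibrations.

    history: list of dicts with 'as_found' and 'as_left' errors
    (e.g. as a percentage of instrument span), oldest first.
    """
    return [
        history[i]["as_found"] - history[i - 1]["as_left"]
        for i in range(1, len(history))
    ]

def drift_is_increasing(drifts):
    """True if the absolute drift grows from one interval to the next."""
    mags = [abs(d) for d in drifts]
    return all(later > earlier for earlier, later in zip(mags, mags[1:]))

# Hypothetical transmitter: adjusted back near zero at each calibration,
# but found progressively further out of adjustment each time.
events = [
    {"as_found": 0.10, "as_left": 0.02},
    {"as_found": 0.15, "as_left": 0.01},
    {"as_found": 0.28, "as_left": 0.02},
    {"as_found": 0.45, "as_left": 0.03},
]
print(interval_drifts(events))               # drift over each interval
print(drift_is_increasing(interval_drifts(events)))  # → True
```

A growing per-interval drift, as in this hypothetical history, is exactly the pattern that would argue for shortening the calibration interval; a flat, small drift would argue for lengthening it.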

The ‘History Trend’ window enables users to view key figures from several calibration events simultaneously, so the calibrations of a position or device can be evaluated over a longer time period than the normal calibration result view allows. For example, the user can get an overview of how a particular device drifts between calibrations and whether the drift increases with time. The engineer can also analyse how well different devices are suited to a particular area of the plant or process.

Reporting is straightforward and the reports can be tailored to suit individual needs, using the ‘Report Design’ tool option. Calibration frequency can be decreased if the instrument has performed to specification and the drift has been insignificant compared to its specified tolerance, or if the instrument is deemed to be non-critical or in a low priority location. 

Calibration frequency should be increased if the sensor has drifted outside its specified tolerance during a given time period, or if it is located in a critical process or area of the plant and has drifted significantly relative to its tolerance. Frequency should also be increased for sensors in areas of high economic importance to the plant: where a ‘faulty’ sensor could cause costly production downtime, or where a false measurement could lead to inferior-quality batches or a safety issue.
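These decision rules can be expressed as a simple policy. The sketch below is a hypothetical illustration – the function name, thresholds and halving/doubling policy are assumptions, not Beamex CMX logic – showing how observed drift, specified tolerance and criticality might combine into an interval recommendation.

```python
# Hypothetical decision rule (thresholds and names are assumptions,
# not Beamex CMX logic): adjust a calibration interval based on how much
# of the specified tolerance the observed drift consumed, and on whether
# the instrument is in critical service.

def recommend_interval(current_days, max_drift, tolerance, critical):
    """Return a suggested calibration interval in days.

    max_drift and tolerance must be in the same units; 'critical' marks
    instruments in safety-, quality- or production-critical service.
    """
    used = abs(max_drift) / tolerance    # fraction of tolerance consumed
    if used >= 1.0:                      # drifted out of tolerance
        return current_days // 2         #   -> calibrate much sooner
    if critical and used > 0.5:          # critical and drifting noticeably
        return current_days // 2
    if not critical and used < 0.25:     # stable and non-critical
        return current_days * 2          #   -> extend the interval
    return current_days                  # otherwise keep the schedule

# Stable, non-critical transmitter: interval can be extended.
print(recommend_interval(365, max_drift=0.05, tolerance=0.5, critical=False))  # → 730
# Critical sensor found out of tolerance: interval is halved.
print(recommend_interval(182, max_drift=0.6, tolerance=0.5, critical=True))    # → 91
```

In practice a plant would tune the thresholds to its own quality and safety requirements; the point is that the decision becomes data-driven rather than fixed by habit.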

