A new approach to flow meter calibration

30 August 2016

Neil Bowman suggests some alternatives to traditional set time-based calibration strategies.

The oil and gas sector has benefited greatly from recent advances in flow measurement technology. 

Flow measurement was traditionally performed by mature technologies such as orifice plate meters, turbine meters and positive displacement type devices. However, over the last two decades these have gradually been replaced by newer technologies such as ultrasonic, electromagnetic and Coriolis meters which are, generally, non-intrusive and have much higher turn-down ratios, so are able to measure a much larger range of flows to the required accuracy. These changes have been accompanied by rapid developments in sensing technology, process monitoring and the introduction of meter diagnostics. 

However, diagnostics alone are not enough to ensure that the required accuracy of the device is met, which means that periodic calibration remains an essential part of the flow measurement system maintenance. The issue facing operators now is that the cost of system shutdown makes calibrations an expensive and time-consuming undertaking. 

Meter calibrations are traditionally performed based on a specified time interval. Instrument type, operating conditions, measurement application and the manufacturer’s recommendation are just some of the factors that need to be considered when selecting a calibration interval. 

Set time-based intervals
One issue with performing calibrations at a set time-based interval is that it takes no account of the conditions to which the meter has been subjected, whether it has suffered a statistically significant degree of calibration drift that is likely to impinge on measurement accuracy, or the financial exposure such inaccuracies create for operators. 

This means that the operator could be performing calibrations with unnecessary frequency, incurring all the associated costs and impacts on operational efficiency. Conversely, the meter may be drifting at a higher rate than anticipated, resulting in increased financial exposure. Without any form of diagnostics or condition monitoring in place, however, the operator does not know whether either situation has arisen. 

There are two other calibration scheduling methodologies that operators should be considering to harness the benefits of modern technological developments. The first is risk-based calibration, where the calibration interval is set by weighing the financial exposure created by calibration drift over time against the cost of calibrating and otherwise maintaining the device at that interval. 
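
As a rough illustration of the trade-off, the sketch below compares the exposure from an assumed linear drift with the calibration spend for a handful of candidate intervals. The drift rate, exposure figure and calibration cost are purely hypothetical, not values from any particular installation.

```python
# Minimal sketch of a risk-based schedule, assuming drift grows linearly between
# calibrations; every figure below is hypothetical and purely for illustration.

def annual_cost(interval_days, drift_pct_per_day, exposure_per_pct_per_day, calibration_cost):
    """Expected yearly cost: mismeasurement exposure plus calibration spend."""
    avg_error_pct = drift_pct_per_day * interval_days / 2   # mean error over the interval
    exposure = avg_error_pct * exposure_per_pct_per_day * 365
    calibrations_per_year = 365 / interval_days
    return exposure + calibrations_per_year * calibration_cost

# Hypothetical inputs: 0.0005 %/day drift, $40,000 exposure per 1% error per day
# (a meter passing roughly $4m of product daily), $75,000 per calibration including shutdown.
for interval in (30, 90, 180, 365):
    cost = annual_cost(interval, 0.0005, 40_000, 75_000)
    print(f"{interval:>3} days: ${cost:,.0f} per year")
```

On these made-up numbers the lowest total cost falls at the 90-day interval; calibrating more often wastes money on shutdowns, while calibrating less often lets the exposure dominate.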

The second alternative approach is condition-based calibration, which uses diagnostic data acquired from the device or measurement system, either by post-processing the primary measurement data or as secondary data. This data gives qualitative insight into the health of the measurement system and can indicate anomalies in the performance of that device or system. 
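
The principle can be illustrated with a simple check of diagnostics against a baseline "footprint" recorded at the last calibration. The diagnostic names and tolerance bands below are assumptions for the sake of the example, not those of any particular meter.

```python
# Minimal sketch of a condition-based check: flag any diagnostic that has moved
# outside its tolerance band relative to the baseline footprint. All names and
# values are hypothetical.

baseline = {"velocity_profile_ratio": 1.00, "signal_gain_db": 42.0, "snr_db": 30.0}
tolerance = {"velocity_profile_ratio": 0.02, "signal_gain_db": 3.0, "snr_db": 5.0}

def flag_anomalies(latest: dict) -> list:
    """Return the diagnostics that have drifted outside their tolerance band."""
    return [name for name, value in latest.items()
            if abs(value - baseline[name]) > tolerance[name]]

latest_readings = {"velocity_profile_ratio": 1.05, "signal_gain_db": 43.1, "snr_db": 29.2}
anomalous = flag_anomalies(latest_readings)
if anomalous:
    print("Investigate, and consider bringing calibration forward:", anomalous)
else:
    print("No diagnostic anomalies; calibration could be deferred")
```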

Enlightened calibration
Attitudes within industry are beginning to shift towards more enlightened methodologies, despite the current popularity of time-based calibration scheduling. In the UK, the Oil and Gas Authority (OGA) has updated its guidelines to support risk-based and condition-based calibration.

In principle, however, the ideal calibration strategy would combine these approaches: qualitative condition-based diagnostic data used in conjunction with statistical modelling of historical calibration data to drive up efficiency, reduce costs and maintain accuracy. 
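
A minimal sketch of that combination might fit a simple linear drift model to the errors recorded at past calibrations, predict when the drift will reach a tolerance limit, and bring the next calibration forward if the diagnostics flag anomalies. The calibration history, tolerance and adjustment factor below are hypothetical.

```python
# Minimal sketch combining historical drift modelling with a condition flag;
# the drift history, tolerance and adjustment factor are hypothetical.
from statistics import linear_regression  # Python 3.10+

# Meter error (%) observed at each past calibration, days since the first one.
days =   [0,    180,  365,  545,  730]
errors = [0.00, 0.04, 0.09, 0.12, 0.17]

slope, intercept = linear_regression(days, errors)   # fitted drift in % per day
tolerance_pct = 0.25                                  # maximum tolerable error

# Predicted days (from the first calibration) until the fitted drift hits tolerance.
days_to_limit = (tolerance_pct - intercept) / slope

diagnostics_flagged = False   # would come from the condition-monitoring system
if diagnostics_flagged:
    days_to_limit *= 0.5      # bring the calibration forward if diagnostics degrade

print(f"Recommended next calibration: about {days_to_limit - days[-1]:.0f} days after the last one")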

Despite the benefits of these modern approaches to calibration scheduling, many operators still use time-based scheduling. The simplicity of time-based scheduling, lack of training, outmoded apparatus, lack of budget and systemic inertia may all be reasons why some operators have not yet embraced this new mindset. 

However, the financial facts of oil and gas flow measurement make clear the importance of achieving a high degree of measurement accuracy. With an oil price of around $40 per barrel at the beginning of 2016, and daily global sales in the region of 100 million barrels, revenue amounts to roughly $4 billion per day. If the uncertainty in fiscal measurement for liquid is ±0.25%, the resulting financial exposure is about $10 million per day, or around $3.6 billion per year, the equivalent of nearly a day’s production. The significant fiscal and custody transfer implications of measurement inaccuracy, and of failing to adapt the meter calibration approach, can clearly be seen.
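
For illustration, the back-of-envelope calculation below simply reproduces those approximate figures.

```python
# Back-of-envelope check of the figures quoted above (approximate, for illustration).
price_per_barrel = 40          # $ per barrel, early-2016 level
daily_sales = 100e6            # barrels per day
uncertainty = 0.0025           # ±0.25% fiscal measurement uncertainty

daily_revenue = price_per_barrel * daily_sales   # roughly $4 billion per day
daily_exposure = daily_revenue * uncertainty     # roughly $10 million per day
annual_exposure = daily_exposure * 365           # roughly $3.65 billion per year

print(f"Daily revenue:   ${daily_revenue/1e9:.1f} bn")
print(f"Daily exposure:  ${daily_exposure/1e6:.0f} m")
print(f"Annual exposure: ${annual_exposure/1e9:.2f} bn")
```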

Neil Bowman is a project engineer at NEL, part of the TÜV SÜD Group.

