What can the Cloud offer the process industry?

19 March 2013

Tim Taberner, of Eurotech, looks at some of the benefits and barriers to adoption of Cloud-based technologies within the process industry.

It is almost impossible for a day to pass without hearing the term ‘the Cloud’. It is so all-encompassing that, unless tightly defined, it is easy to make assumptions and draw false conclusions based upon unrelated paradigms, misunderstand the risks applicable to a specific implementation and fail to grasp the potential value of the technologies emerging from this revolution.

Wikipedia defines ‘Cloud computing’ (or ‘the Cloud’) as the use of computing resources (hardware and software) delivered as a service over a network (usually the Internet), typically under a ‘pay per use’ or monthly-fee pricing model. The process control industry has already used the concept of remote hardware (resources) providing services to multiple users for many years, so what is all the fuss about?

In the past, companies were forced to acquire their own IT infrastructure to provide services; today, for a monthly fee, it is possible to tap the massive investments of Cloud computing providers to obtain computational power. Some benefits of cloud services are easy to understand: they can scale massively in a cost-effective and timely manner. The cost of the service is directly proportional to the size of the deployment, meaning that cost-effective, small-scale projects or pilot studies can be run and then simply expanded to enterprise scale. Such benefits can easily be quantified by an organisation, compared with those of installing or expanding its own infrastructure, and a decision made about which is most beneficial.
The real benefits coming from this emerging technology however, while less tangible, are far more profound.

The Internet of Things: Machine-to-machine platforms
The real opportunities for the process industries come from leveraging the underlying technologies and services that the Cloud can power, especially those concerning interconnectivity and interoperability of sensors and systems, such as Machine-to-Machine (M2M) platforms. M2M platforms provide all the services needed to easily connect any sensor or system located in the field to enterprise back-end applications, creating an intelligent infrastructure where devices, machines and subsystems are capable of interacting directly with each other, as well as with operators, enterprise applications and other stakeholders. This application scenario, which encompasses sensors, communications, middleware, software and enterprise applications, falls under the umbrella of the so-called Internet of Things (IoT).

M2M platforms offer a roadmap to overcoming one of the process industries' oldest challenges – how to combine the data within separate islands of automation, and more importantly with enterprise and third-party sources, to provide process improvement, cost reduction and enhanced revenue streams across the entire supply chain.

Historically, control systems have been purchased as point-in-time solutions, focussed on the need to optimise a particular process. Remote monitoring devices communicate using proprietary, or at best de facto, industry protocols which are not understood in the wider world, thereby limiting their interoperability with other systems. Real-time information rarely extends beyond the confines of the primary control system, and then only with significant restrictions on its external use.

An important concept of IoT based systems is the decoupling of data producers and data consumers, breaking the direct relationship between central and remote systems. Devices and systems publish information that is of potential interest as it happens without knowing who will consume that data. Applications subscribe only to the information they need. As new applications or stakeholders are added, they simply subscribe to the information of interest, with no changes needed to the existing systems.
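The publish/subscribe decoupling described above can be sketched in a few lines of Python. This is an illustrative in-memory model only – a real M2M platform would use a message broker and a protocol such as MQTT – and the topic names and payloads are invented for the example:

```python
# Minimal in-memory publish/subscribe sketch. A real deployment would use a
# message broker; this only illustrates the decoupling of producers/consumers.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """An application registers interest in a topic; the publisher
        never needs to know that this application exists."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """A device publishes data as it happens, without knowing who
        (if anyone) will consume it."""
        for callback in self._subscribers[topic]:
            callback(topic, payload)

broker = Broker()

# A new application is added later simply by subscribing; no change is
# needed to the publishing device or to any existing subscriber.
events = []
broker.subscribe("plant/valve/101/state", lambda topic, p: events.append(p))

# The field device just publishes its state as it happens.
broker.publish("plant/valve/101/state", {"state": "open", "transit_s": 4.2})
```

Adding a second stakeholder is a one-line `subscribe` call; removing one touches nothing else – which is precisely the property that lets systems grow without redesign.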

A trivial parallel would be to think of a group organising a trip to the cinema. Current control systems would do this by phone. One of the group would call each of the others to ask if they want to go, collating the answers. If any of the group wanted to know who else is going prior to making their own decision, they would need to ask the person collating the answers, and this information would not be fully available until everyone had been contacted, potentially meaning repeated calls before the answer was known. The process is inefficient, with potential for some participants to change their mind without informing the group, or be missed because they were busy when called.

In an IoT world, someone would create (publish) a cinema event, each of their friends (subscribers) would see this concurrently and if interested ‘join’ the event. Everyone would inherently know who else was going in real time without additional effort. Anyone busy at the time of the original posting will still see it later and be able to participate. Subsequently, any of the group may change their mind and again this information is immediately available.
Translating this into what might happen in an IoT-enabled control system, consider a motorised valve sitting on a pipeline. This simple device is in one of a few states: open, closed, in transit, stuck, or reporting a faulty open/closed indicator.

In a traditional control system, the valve will present this data via a gateway device (such as an RTU or PLC) into the central system, which in turn informs the operators. Transit times for the valve may be measured by the control system and, if outside a limit, an alarm is raised signalling that the valve needs an urgent service visit. The control room operator will then pass this information on to the maintenance team.

In an IoT system, the valve (or a connected gateway) will publish that it has opened, closed, or is in transit. Because multiple stakeholders are able to add applications working on the published data, however, we can now do more automatically:

* Separate maintenance applications would monitor the transit time and examine other data from sensors around the valve to determine if it is stuck (urgent callout) or just has a faulty indicator (non-urgent callout) before automatically reporting it to the on-call engineer. Supply chain partners can also automatically be notified of the potential inventory requirement to service the fault, responding with current availabilities.
* A background application might compare transit times of all valves in the plant with historic data to predict pending failures, allowing planned preventative maintenance. Plant designers can monitor the performance of different valve types over time to determine the most effective solution for future use.
* The valve manufacturer might analyse information from all valves across multiple operators, identifying potential systemic failures, and informing new product design.
* Downstream and upstream stakeholders can see the position of the valve, providing interlocks into their own processes.
* Supply levels of consumables (such as dosing chemicals) can be monitored, compared with forecast demand and automatically replenished from the most favourable supplier when necessary.
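A rough sketch of the first of these applications – the transit-time check with a cross-check against nearby sensors – might look as follows. The payload shape, the 30-second transit limit and the downstream-flow cross-check are assumptions invented for illustration:

```python
# Sketch of a maintenance application subscribing to valve state messages:
# it classifies a fault before raising a callout. Thresholds and payload
# fields are illustrative assumptions, not a real product's API.
TRANSIT_LIMIT_S = 30.0  # assumed acceptable transit time for this valve type

def classify_fault(msg, downstream_flow_changed):
    """Decide whether a valve is stuck (urgent callout) or just has a
    faulty open/closed indicator (non-urgent), using data published by
    other sensors around the valve."""
    if msg["state"] == "in_transit" and msg["transit_s"] > TRANSIT_LIMIT_S:
        # Cross-check: if the downstream flow changed, the valve actually
        # moved, so the position indicator (not the valve) is at fault.
        if downstream_flow_changed:
            return "faulty-indicator"   # non-urgent callout
        return "stuck"                  # urgent callout
    return "ok"

print(classify_fault({"state": "in_transit", "transit_s": 45.0}, False))  # stuck
```

The point is not the trivial logic but where it lives: this application subscribes to data the valve was publishing anyway, so it can be added (or replaced) without touching the control system.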

The control system is now only concerned with the control of the process itself, simplifying its design and reducing the risk of failures. This all happens within a ubiquitous, scalable, open standards-based architecture, making it easy to add new applications and stakeholders without affecting the existing processes. Crucially, none of the above applications need to be planned at the time the system is deployed; they are simply added as required, subscribing to the necessary source data in order to carry out their process, potentially publishing results back for use by other applications.

This flexibility allows new interdependencies to be identified, optimising the processes around this simple device and finding new ways to operate, providing benefits throughout the entire supply chain. Furthermore, data is available to the wider enterprise, allowing optimisation of processes across all its assets and improving demand forecasting.

Barriers to adoption
So why are process users not adopting IoT-based technologies?
Misperceptions of poor security – Rather like the confident driver who is a nervous passenger, potential users express concerns over the loss of control inherent in public cloud-based solutions, normally verbalised as potential security issues. While the technology can be deployed on a private cloud to overcome this, even for public cloud implementations there is little evidence that security concerns are valid, and much to suggest that they may actually be more secure than privately hosted solutions. Public cloud data centres generally have higher levels of redundancy and also manage security updates more actively than is the case on many private installations. Dedicated virtual machines, with options for secure encryption, restrict access and visibility on an individual basis. The most likely risk to system security, therefore, is exactly the same as for any other type of shared computer access – that is, the compromise of individual usernames and passwords.
Legacy system integration – A second common misconception is that IoT-based solutions can only be implemented on ‘green field’ installations due to the inability to integrate the existing systems within a plant. In reality, legacy systems can easily be integrated into an IoT architecture by using smart gateway devices providing protocol translation between the legacy system and the enterprise bus of the IoT implementation. These ‘multi-service gateways’ are generic, remotely configurable devices capable of having protocol agents and business logic downloaded as required by the needs of an individual installation.
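The gateway's job – a downloadable protocol agent reading a legacy device and republishing its data as self-describing messages – can be sketched as below. The register addresses, fixed-point scaling and topic name stand in for a real legacy poll (e.g. Modbus) and are assumptions for illustration only:

```python
# Illustrative multi-service gateway sketch: a "protocol agent" reads raw
# registers from a legacy RTU/PLC (faked here as a dict) and republishes
# them as a self-describing message on the enterprise bus.
import json
import time

LEGACY_REGISTERS = {40001: 1, 40002: 742}  # raw values from the legacy device

def legacy_agent(read_register):
    """Protocol agent: maps raw registers to named, scaled engineering
    values. Agents like this would be downloaded to the gateway as needed."""
    return {
        "valve_open": bool(read_register(40001)),
        "flow_m3h": read_register(40002) / 10.0,  # legacy fixed-point scaling
    }

def publish(topic, payload):
    """Stand-in for the gateway's enterprise-bus client: wrap the values
    in a timestamped, self-describing message."""
    return json.dumps({"topic": topic, "ts": int(time.time()), **payload})

message = publish("plant/rtu7/telemetry", legacy_agent(LEGACY_REGISTERS.get))
```

Once the translation happens at the gateway, every subscriber upstream sees named engineering values rather than opaque registers – which is what makes legacy plant first-class data in an IoT architecture.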
Project-led versus holistic view – The main barrier to adoption, however, is not technical at all, but relates to the way projects are planned and funded. Key benefits of an IoT approach are realised when systems are interlinked, allowing relationships to be created and inspected between previously unconnected data. Conversely, most process projects are concerned with point-in-time solutions to control a particular set of plant equipment. Little or no consideration is given to future-proofing the design and enabling business and process improvement outside the scope of the system being directly controlled.

Although the case for deploying the Cloud can often be made simply on the future-proof nature of the data-agnostic, communications-network-independent architecture of IoT technologies, deploying an IoT solution needs vision: the understanding that the ability to combine data from the system in a flexible, scalable way with that from other sources, and to make this data available to the wider enterprise, supply and stakeholder chains, will inevitably lead to process improvements in the widest sense, as well as opening up possibilities for new revenue streams.

There is little doubt that the arrival of the Internet of Things has had the same profound effect on M2M paradigms as social networking and Web 3.0 have had on the way human society interacts. New ways of extracting and combining data from disparate sources will enable a process system to interact directly with the entire supply chain, stakeholders and customers as part of a comprehensive business optimisation process. Early adopters are undoubtedly gaining competitive advantage over slower entrants, and the only real question for process professionals is not if, but when, to adopt these emerging technologies and join the revolution.

For more information visit http://www.eurotech.com/en/products/software+services
