Is the PLC losing its relevance today?

09 May 2023

Suzanne Gill finds out how the role of the traditional PLC is evolving with the advent of edge computing.

To understand how the role of the traditional Programmable Logic Controller (PLC) is evolving, it’s interesting to first take a look at where it came from.

The first PLC was brought to market in 1968 – the Modicon 084 – designed to meet General Motors’ requirement for a standard machine controller. It was a relatively simple operational technology (OT) electronic programmable device used to replace hard-wired control relay circuits.

“The PLC provided a flexible design environment that was easy to modify and maintain with fast troubleshooting capabilities. Its operation was such that it would cyclically scan inputs, execute its ladder logic program, then write to the outputs,” explained Dave Sutton, Product Manager at Schneider Electric.
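That cyclic scan – snapshot the inputs, evaluate the logic, commit the outputs – can be sketched in a few lines of Python. This is purely illustrative; the function and signal names are hypothetical, not taken from any vendor’s runtime.

```python
# Minimal sketch of a PLC scan cycle: read inputs, execute the logic
# program, write outputs. Names here are illustrative assumptions.

def scan_cycle(inputs, logic, outputs):
    """One PLC scan: snapshot inputs, run the logic, commit outputs."""
    image = dict(inputs)      # input image table (snapshot, so logic
                              # sees a consistent view for the whole scan)
    results = logic(image)    # evaluate the 'ladder' program
    outputs.update(results)   # write the output image table
    return outputs

# A trivial rung: the motor runs when start is pressed and stop is not.
def ladder(image):
    return {"motor": image["start"] and not image["stop"]}

outputs = {}
scan_cycle({"start": True, "stop": False}, ladder, outputs)
print(outputs)  # {'motor': True}
```

In a real controller this loop repeats every few milliseconds, which is what gives the PLC its deterministic behaviour.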

By the early 1990s technology had moved on and Programmable Automation Controller (PAC) platforms started to appear. “The PAC included additional functionality such as PID control, wider connectivity and networking, embedded web servers, more processing and data handling capacity, but essentially it was still a PLC with added features,” continued Dave.

In more recent times, manufacturers have faced ongoing demands and challenges to up their game – to manufacture in a smarter, more agile way and to digitally transform their operations – to reduce waste, energy consumption and downtime whilst increasing overall effectiveness and productivity.

This, says Dave, has highlighted a few key limitations of PLCs. These include: 

• PLCs are not IT ready: A wave of IT-based IIoT technology is now being used alongside OT PLCs – artificial intelligence, machine learning, digital twinning and data analytics, for example. However, a fair amount of low-level engineering and integration is required to bring these technologies together.

• Single-node engineering: Today’s PLCs and PACs are designed around IEC 61131. Application programs are written for one node (one PLC) at a time, so a larger application distributed over a wider architecture requires substantial low-level communications and network configuration to be managed separately.

• Vendor lock-in: PLC hardware and the application software are bonded together, so a user is effectively locked into one vendor’s proprietary solution.

• Obsolescence management: While manufacturers generally tend to offer assisted migration paths, a substantial engineering effort is usually required to migrate from a legacy PLC to a new platform. 

The future for PLCs
Dave went on to discuss two key influences which are defining how PLCs are evolving:

• Edge computing: The new generation of edge controllers can sit alongside traditional PLCs, but it is increasingly common to run both OT control and information technology (IT) data management applications on the same edge controller. Recent advances in edge control hardware and dual operating systems now provide the robustness, reliability and availability required to run real-time deterministic control on one core using a Linux operating system, while the IT data applications run on a second core using a different operating system. This ensures full independence between tasks, yet all are tightly integrated onto a single platform, simplifying architectures and engineering effort. The edge controller could take the form of a local industrialised PC, or a virtualised instance running on a local micro data centre or remotely in the cloud.

• IEC 61499 standard: IEC 61499 extends and enhances IEC 61131. It enables application portability across vendors while also decoupling software and hardware, facilitating holistic application design which can later be deployed to one or several nodes, including edge controllers, PLCs and variable speed drives.

Inter-node communication is managed transparently, which can drastically reduce engineering effort. An independent body, UAO, manages the implementation of this approach and has created a shared run-time engine which is hardware agnostic and can be deployed on any UAO-compliant device. This also holds true for obsolescence management, as application code can simply be re-deployed to new UAO-compliant hardware in the future.
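Where IEC 61131 programs are cyclically scanned, IEC 61499 models an application as event-driven function blocks: execution is triggered by event inputs, and data travels alongside the events. A rough Python analogy is sketched below – the class and method names are illustrative assumptions, not an API from the standard.

```python
# Rough analogy of an IEC 61499 basic function block: the REQ event
# triggers execution, and a CNF event carries the result onward.
# All names here are illustrative, not taken from the standard.

class ScaleFB:
    """Function block that scales a raw sensor value when REQ fires."""
    def __init__(self, gain):
        self.gain = gain
        self.out = None

    def on_req(self, raw):
        # Event input REQ arrives with data input 'raw';
        # compute the data output and emit the CNF event.
        self.out = raw * self.gain
        return "CNF", self.out

fb = ScaleFB(gain=0.1)
event, value = fb.on_req(250)
print(event, value)  # CNF 25.0
```

Because each block is self-contained and wired to others only by events and data connections, a tool can deploy the same application across one node or several – the portability the standard is aiming at.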

“PLCs have come a long way since their release in 1968,” said Dave. “Their future looks set to migrate towards edge controllers that facilitate native IT-OT convergence, powered by IEC 61499, which enables hardware-agnostic solutions, simpler obsolescence management and distributed architectures with minimised engineering effort.”

Widely used
Steve Ward, Director, Application Engineering EMEA at Emerson, explained that traditional PLCs have been central to factory-based manufacturing for a long time and are still widely used to control machines, assembly lines and processes. “Their widespread adoption is due to their ease of programming and installation and the speed at which they perform tasks,” he said.

On the one hand, Steve argues that the role of the PLC has not changed. “Industrial facilities still need a high-speed, deterministic control device that is easy for mechanical and electrical engineers to program and debug, tightly integrates various sensors and actuators, motion controllers and safety equipment, and with redundancy can provide high availability in critical applications.”

On the other hand, he says that the PLC contains large amounts of information that may be useful to other disciplines within the organisation. “This includes machine utilisation data, quality data, energy usage data and much more. One possibility is for the PLC to support a modern communications protocol, such as OPC UA, to allow an external application to read the PLC data in a secure and timely manner. Data may also be written back to the PLC to improve and optimise operations. Neither of these uses has required a fundamental change in the nature of the PLC.”

Steve suggested that, to obtain edge functionality, a separate edge computing device could be added. The advantage of this approach would be that few changes are required in traditional PLC architectures and there is clear delineation between the OT-focussed PLC and IT-focussed edge device. The downside is that there is limited integration between the PLC and the edge device. You also need additional panel space and power supplies, and many users may not even consider how the data in the PLC could be used to best advantage because it is hidden from potential users. IT-focussed users may not even know what a PLC is or what it does.

He suggested an alternative future direction – to start performing edge processing tasks inside the PLC. “Users can programme the PLC in languages like Structured Text or C. However, these languages are not ideal and there remains a lack of connectivity, which limits what can be done. Some PLC vendors are considering using more modern and friendly programming languages, like Python, but the connectivity issue remains.”

“Another approach could be to take advantage of modern multi-core CPUs and use hypervisor technology to partition a single physical device into two logical devices,” he continued. “With Emerson’s PACSystems edge controllers, a quad-core CPU is split into a PLC engine and an IPC running Ubuntu Linux, each using two cores.” In this scenario the PLC engine operates as before, providing high-speed, deterministic control and utilising standard PLC programming and diagnostic techniques. The Linux partition of the edge controller runs Emerson’s PACEdge stack, which combines open-source and Movicon software in an IIoT platform for the development of edge applications. A virtual Ethernet channel, using OPC UA, links the PLC to PACEdge. Inside PACEdge, users can configure applications using open-source tools.

“A more radical future approach might be to dispense with the PLC entirely,” continued Steve.  “Computer power is increasing and even simple devices can contain a CPU and offer connectivity. It would be entirely feasible to have an intelligent actuator that receives data from one or more sensors, including from the Internet, and runs its own algorithm to determine the control function required. Consider an intelligent lightbulb that takes data from multiple switches, the time of day, room occupancy, and ambient lighting and decides whether to turn itself on or off and how brightly it shines, then imagine how this concept might work for PLC-based control.”
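Steve’s lightbulb thought experiment boils down to a decision function that fuses several inputs into one control output. A minimal sketch is below; the thresholds, input names and brightness values are hypothetical assumptions for illustration, not figures from the article.

```python
def bulb_decision(switch_on, occupied, ambient_lux, hour):
    """Decide on/off and brightness for the hypothetical intelligent bulb.
    Inputs: switch state, room occupancy, ambient light level (lux),
    and the hour of day. All thresholds are illustrative assumptions."""
    night = hour < 6 or hour >= 20
    if not switch_on or not occupied:
        return {"on": False, "brightness": 0}
    if ambient_lux > 300:              # plenty of ambient light: stay off
        return {"on": False, "brightness": 0}
    # Dimmer at night, full brightness during the day
    brightness = 60 if night else 100
    return {"on": True, "brightness": brightness}

print(bulb_decision(True, True, 50, 22))  # {'on': True, 'brightness': 60}
```

An intelligent actuator running PLC-style control would work the same way: sensors (local or from the Internet) feed the inputs, and the device computes its own output rather than waiting on a central controller.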

Defining edge computing
Holger Meyer, Director Control Systems at Phoenix Contact, dug deeper into the definition of edge computing. “With the help of edge computing, data is pre-processed close to a data source, at the ‘edge’ of the network, which can reduce the load on the data centre and can also enable a faster response to events on site. Physical distance often rules out the fast reaction times that may be required, and edge solutions address this without the data centre having to move to the plant,” he said.

With data volumes set to continue to grow exponentially, and real-time responses more often required, it is not possible to transfer all the data to a data centre in the required time. With the edge component, sending all the data is not necessary, solving the problems of real-time response and bandwidth utilisation.
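A common way an edge component cuts the data sent upstream is a deadband filter: a reading is forwarded only when it moves more than a threshold away from the last value transmitted, and everything else stays at the edge. The sketch below illustrates the idea; the function name and threshold are assumptions for the example, not a specific product feature.

```python
def deadband_filter(samples, threshold):
    """Forward only samples that differ from the last transmitted value
    by more than `threshold`; the rest never leave the edge device."""
    sent = []
    last = None
    for s in samples:
        if last is None or abs(s - last) > threshold:
            sent.append(s)
            last = s
    return sent

# Six temperature readings shrink to three transmissions.
readings = [20.0, 20.1, 20.05, 21.2, 21.3, 23.0]
print(deadband_filter(readings, threshold=0.5))  # [20.0, 21.2, 23.0]
```

The same pattern scales up: averaging windows, event detection or local analytics all trade a little edge computation for much less traffic to the data centre.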

“Edge computing is playing an increasingly crucial role in manufacturing because it enables older machines without integrated network access to be upgraded. In addition, real-time processing, network utilisation and machine control can all be performed in one device,” continued Holger.

He posed the question as to whether edge computing has to mean a completely different device, or whether a traditional controller could simply handle data from edge tasks too. “There are many possible variations, depending on the task,” he said in answer to this. “If data is only required from a few sensors, then most modern controllers can certainly handle this task. A modern controller is understood here as a product that allows corresponding integration of IT software, because IT and OT can then be ideally combined on one device. Open operating systems additionally round off the overall picture of the control system. If, in addition, the plant process is to be operated with hard real-time performance, then this is certainly reserved for only the most powerful controllers, or a separate unit.”

Today, the question about the role of traditional machine controllers, with the advent of edge computing, can be answered in many different ways depending on the application requirements. Luckily technology has advanced to the point that a solution will be available to meet practically every requirement. 
