Getting the best out of your data

16 November 2020

Turning data into actionable information is vital to the success of any Industry 4.0 project. Suzanne Gill finds out what data analytics solutions are available today for both process and factory applications and gathers advice about the successful integration of these solutions.

Manufacturing data produced daily in both process and discrete applications is growing exponentially. In the near term, tremendous amounts of additional data – both structured and unstructured – will become available from both internal and external sources. Turning this raw data into useful insights requires operational technology (OT) and information technology (IT) decision makers to employ data analytics and management policies.

Marcia Gadbois, president and general manager at ADISRA, points out that there are many manufacturing use cases for this data including predictive analytics, predictive quality, demand forecasting, inventory management, and warranty analysis. “These use cases rely on both historical data, where patterns or relationships are identified among the various data points, and real-time data, where factors having the greatest effect on yield are optimised,” she said.

She went on to explain that there are five main pillars that decision makers should follow to garner insights from these various data sources. The first is to decide what information is needed to foster collaborative decision-making between IT and OT, and across supply chains, lines, divisions, and plants. Simply put, find the information that identifies ‘what happened?’ The next step is to seek insights from the data by drawing conclusions from the sources about ‘why it happened?’ These insights help drive data-driven decisions.

The third pillar is to project foresight, predicting future outcomes by taking historical data and asking ‘what will happen next, and why?’ The fourth pillar relates to data agility, or the assurance that the data required by people or processes can be accessed no matter their location. It answers the question ‘how fast can the right person access the data needed to translate that information into action?’

Finally, the fifth pillar is to align data strategies with business objectives in order to adapt in real time to market changes and innovation. Simply put, ‘what challenges need to be solved and where is the data to assist in making these strategic decisions?’

“To answer these questions, data analytics and data governance must be utilised,” said Gadbois. “Data analytics is the use of data, statistics, and qualitative analysis to drive decisions and actions. Data governance is the practice needed to ensure the proper management of the various data sources.

“ADISRA’s products focus on analytics in three main areas – descriptive, diagnostic, and predictive. Descriptive analytics uses statistics to gather and visualise data to assist decision-making about ‘what happened?’ Diagnostic analytics uses statistics to find valuable insight to answer the question ‘why did this happen?’ Finally, predictive analytics uses statistics to forecast ‘what will happen next?’”

Gadbois points out that there are several necessary steps in the manufacturing data analytics journey. First, define the problem to be solved and find the data required to solve it, whether it be historical, structured, or unstructured. Do not be afraid to pull data out of legacy systems and integrate it into a new source. Despite having a long-term vision for the analytics journey, start small while continuing to grow the project. Identify the stakeholders and how they define success. Get buy-in from the business on what it wants to see, and use data governance to keep the business involved. Finally, measure results in quantifiable ways and continually improve the model as new information becomes available.

Data, people and analytics
Michael Risse, CMO & VP at Seeq Corporation, cites three critical aspects for success when moving from data to data-based decision making in Industry 4.0 initiatives. These are data, people, and analytics. He said: “First and foremost is the data and ongoing access to it – in both legacy systems and industrial applications – because the best laid strategies will be tweaked and improved in the course of Industry 4.0 initiatives. No battle plan survives contact with the enemy!” 

According to Risse, the ability to start an advanced analytics project with the data where it is, in silos and different systems and of various types, is absolutely critical. “Plans that begin with assumptions about what data will be required, or with prerequisites for data movement, aggregation or summarisation, are not compatible with the inevitable changes and tweaks. Agility and adapting to change are at the core of Industry 4.0, so starting with fixed expectations and expensive data transformation efforts before any insights have proved their value is the wrong way around.”

Risse says that a consistent finding in successful Industry 4.0 projects is the recognition and leveraging of employees’ skills. These people know the plants, processes, and procedures. In practical terms this means bringing innovation and new abilities to current employees, which increases the organisation’s overall capacity for driving improved outcomes because insights and abilities are distributed rather than centralised far from the point of action.

This insight may sound counter to all the attention paid to data scientists and machine learning. But, according to Risse, what the hype about data scientists misses is that they don’t know the plants, the assets, or the first-principles models of how plants run – so their ability to find insights of value in a changing environment of raw materials, prices, and schedules is limited. The employees who know the plant best, on the other hand, know just what they need for improved outcomes; they simply need advanced analytics software for easier and faster insights.

Finally, with access to the data and the right people, it is time to bring them together and deliver innovation to those with the greatest abilities and needs. “Therefore, the imperative is bringing data science and innovation in analytics to the front lines of the workforce. Advanced analytics applications must wrap up and make accessible the innovations behind the scenes of the software, like the Google search bar wraps the MapReduce algorithm, or the Uber app integrates mapping, AI, and billing systems,” concludes Risse.

Breaking down analytics
According to Elinor Price, senior business development leader, Life Sciences and Specialty Chemicals at Honeywell Process Solutions, in the simplest form, the four types of analytics seen across manufacturing today are: 

• Descriptive analytics: What happened?
• Diagnostic analytics: Why did it happen?
• Predictive analytics: What might happen?
• Prescriptive analytics: Recommends action.

Price explains that the most common analytic – descriptive – is done every day, across every business. This is the summarisation of existing raw data using business intelligence tools, making raw data understandable so it can explain what happened. The most common techniques are data aggregation and data mining of historical data.
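
To make this concrete, the following sketch shows the kind of aggregation Price describes, using the pandas library on a hypothetical historian export; the file name and column names are illustrative assumptions rather than references to any particular product.

```python
import pandas as pd

# Hypothetical historian export: one row per reading, with a timestamp,
# a production line identifier and a throughput value.
df = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])

# Descriptive analytics: aggregate raw readings into shift-level summaries
# that answer "what happened?" for each production line.
summary = (
    df.groupby(["line", pd.Grouper(key="timestamp", freq="8H")])["throughput"]
      .agg(["count", "mean", "min", "max"])
      .reset_index()
)
print(summary.head())
```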

“Looking into past performance to determine what happened and why something happened is the role of diagnostic analytics, which draws on techniques such as principal component analysis (PCA), attribute importance and sensitivity analysis,” she said. “Switching from the past to the future leads to predictive analytics, where probabilities of occurrence of an event are forecast using statistical models and machine learning. Descriptive analytics is the foundation of predictive models. Data scientists work with subject matter experts to tune these models for better prediction. Some of the newest analytics technologies being applied are machine learning algorithms, for example, advanced pattern recognition.”
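
As an illustration of the diagnostic step, the sketch below applies PCA to a hypothetical matrix of process variables using scikit-learn; the data is synthetic and the approach is a generic example rather than Honeywell’s own implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: rows are batches, columns are process variables
# (temperatures, pressures, flows) recorded while a deviation occurred.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))

# Diagnostic analytics: PCA compresses correlated variables into a few
# components; the loadings hint at which variables drove the variation.
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings (variables x components):\n", pca.components_.T)
```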

Price continues: “The most advanced analytics type is prescriptive, which recommends one or more courses of action based on analysis of the data. Prescriptive analytics can recommend all favourable outcomes based on a specified course of action, or can suggest various courses of action to attain a specified outcome.”

Offering advice on the successful integration of analytics solutions, Price said: “For any analytics, data is paramount. If a holistic view of manufacturing is desired, it is important to break down data silos and combine data from disparate data sources into a single environment such as a data lake. Structured and unstructured data from the manufacturing process, equipment and business can all be stored in data lakes together, enabling increased insight across a wide array of stakeholders.

“When data is collected and stored across all the different systems as variables, attributes, measurements, events, etc., data contextualisation becomes a critical consideration. Data contextualisation is the organisation of related data using metadata, which provides data about data. Contextualisation is important for providing a broader understanding of the pieces of data which are critical for analysis, aggregation, models and interpretation of that data.”
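
One minimal way to represent such contextualisation, assuming a generic Python data model rather than any vendor’s schema, is to carry the metadata alongside each measurement; the asset, batch and sensor names below are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContextualisedMeasurement:
    value: float
    unit: str
    timestamp: datetime
    # Metadata ("data about data") ties the raw number back to its context.
    metadata: dict = field(default_factory=dict)

reading = ContextualisedMeasurement(
    value=72.4,
    unit="degC",
    timestamp=datetime.now(timezone.utc),
    metadata={
        "asset": "Reactor-07",      # which equipment produced the value
        "batch_id": "B-2020-1142",  # which batch it belongs to
        "sensor": "TT-0703",        # which instrument measured it
        "site": "Plant-A",
    },
)
print(reading)
```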

Key values
Jim Chappell, head of AI and Advanced Analytics at AVEVA, says that the greatest obstacle to realising Industry 4.0 is not generating and collecting data itself, but extracting value from that data. “The key value comes from data analytics solutions which provide context to large, complex sets of data,” he said. “Data is aggregated from previously inaccessible and disparate sources into a single source of truth. Through advanced data analysis and visualisation – using machine learning and advanced pattern recognition – actionable insights can be extracted. These analytics tools make it possible for people to take insightful and information-driven action to identify and solve problems at their source, before they compound into critical failure points that cascade into further problems.”

According to Chappell, predictive analytics solutions are among the most common tools adopted by enterprises embarking on a process of digital transformation. Existing historical data is analysed to understand an asset’s operational behaviour. Advanced pattern recognition and machine learning are then deployed to monitor the asset in real-time and identify anomalies in how the asset is performing. Potential operating issues are then identified, diagnosed and remediated days or weeks before failures can occur.  
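
A greatly simplified sketch of this pattern, using an off-the-shelf anomaly detector from scikit-learn on synthetic data rather than AVEVA’s actual pattern-recognition engine, might look like this:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical vibration and temperature features from an asset's history.
rng = np.random.default_rng(1)
history = rng.normal(loc=[2.0, 60.0], scale=[0.2, 1.5], size=(5000, 2))

# Learn the asset's normal operating envelope from historical data ...
model = IsolationForest(contamination=0.01, random_state=1).fit(history)

# ... then flag live readings that fall outside it (-1 = anomaly).
live = np.array([[2.1, 60.5],    # consistent with past behaviour
                 [3.4, 71.0]])   # drifting away from the learned pattern
print(model.predict(live))
```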

Prescriptive analytics solutions add another layer of sophistication to predictive analytics. Once anomalies have been identified, these tools assess the potential impact and prescribe the most efficient action to prevent asset failures. Beyond that, prognostic analytics leverages more advanced AI to assess the future state of things, such as forecasting the remaining useful life of an asset.  Combined, these data analytics solutions enable organisations to predict asset failure, assess risk, and then prescribe the most economically advantageous actions to remediate potential asset failures.
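
For illustration only, a naive remaining-useful-life estimate can be sketched by fitting a linear degradation trend to a synthetic health indicator and extrapolating to an assumed failure threshold; real prognostic tools use far more sophisticated models.

```python
import numpy as np

# Hypothetical health indicator (e.g. bearing vibration) sampled daily;
# failure is assumed to occur when it reaches a known threshold.
days = np.arange(120)
health = 1.0 + 0.004 * days + np.random.default_rng(2).normal(0, 0.02, 120)
FAILURE_THRESHOLD = 2.0

# Prognostic sketch: fit a linear degradation trend and extrapolate to the
# threshold to estimate remaining useful life (RUL) in days.
slope, intercept = np.polyfit(days, health, 1)
days_at_failure = (FAILURE_THRESHOLD - intercept) / slope
rul = days_at_failure - days[-1]
print(f"Estimated remaining useful life: {rul:.0f} days")
```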

“The quality of the data is integral to the success of Industry 4.0 projects,” continued Chappell. “Digital tools that use artificial intelligence capabilities, such as machine learning, are only as good as the data under analysis. Underlying the ability to execute a successful analytics strategy is the ability to manage and curate data to ensure quality, integration, accessibility and security. Data historians harmonise and integrate multiple sources of data and ensure they are cleaned, accurate and structured so they can be analysed effectively.”

Chappell advises that businesses should start their digital journey by implementing one or two digital solutions which can be modified and supplemented with others. 

Taking advantage
Shahin Meah, senior director, digital transformation and lifecycle services, Europe at Emerson, says that customers are very interested in taking advantage of analytics to create production, operations and plant-level benefits. “Typically, they want to apply analytics to increase reliability, lower energy consumption, increase quality, and ensure safety,” he said.

“Data, and better yet, actionable data, can enable companies to bring industrial facilities to life with dynamic sensor and analytics networks that detect potential problems before they impact production or risk the safety of plant personnel. Plant workers are armed with real-time insight to proactively assess the integrity of operating equipment, and to target maintenance that minimises risk while ensuring business continuity.”

According to Meah, data-driven analytics, which predict behaviour from statistical analysis, should be familiar to those tasked with making operational improvements. This form of analysis has been deployed for many years, but there has since been an exponential rise in computing power, data storage costs have fallen, and algorithms have become much more sophisticated. Machine learning can now be used within this analysis, removing the need to program everything a machine does.

He said: “Operational analytics with embedded domain knowledge, which can impact and improve the performance of simple equipment, complex assets, process units, and entire production plants, present a massive opportunity for manufacturers and processing companies.

“Failure mode effects analysis is another form of principles-driven analytics, which is widely used to predict or prevent over 80% of known failure modes. Emerson, for example, has built almost 500 failure mode effects analysis models for the common assets found on a plant. We know the data to collect and the algorithms required to interpret that data into actionable information. End users simply need to decide whether it is worth making an investment to obtain the data and establish a digital, repeatable way of making improvements.”
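
By way of illustration, classic FMEA prioritisation multiplies severity, occurrence and detection ratings into a risk priority number (RPN); the failure modes and scores below are hypothetical and not drawn from Emerson’s model library.

```python
# FMEA sketch: rank failure modes by Risk Priority Number (RPN),
# the product of severity, occurrence and detection ratings (each 1-10).
failure_modes = [
    # (description,                 severity, occurrence, detection)
    ("Pump seal leak",               7,        5,          4),
    ("Motor bearing wear",           6,        6,          3),
    ("Impeller cavitation damage",   8,        3,          6),
]

ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"{desc}: RPN = {rpn}")
```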

Meah points out that it is essential for processing and manufacturing plants to leverage existing infrastructure, systems and instrumentation in order to achieve scalable success when integrating analytic tools. This requires the use of analytics solutions that are designed to securely connect and extract data from legacy systems and instrumentation. It is also important to have secure access to field device data residing within the process control system and any newly added monitoring and optimisation hardware and software. 

The NAMUR Open Architecture (NOA) is a standard system architecture specifically designed to support digital transformation initiatives without compromising plant cybersecurity (availability, integrity, confidentiality) and safety. NOA adds to the existing automation architecture and is based on existing standards such as fieldbus protocols and standard software application interfaces (API), which enable less complex integration of digital components from the field level up to the enterprise level. 
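
In that spirit, a minimal sketch of read-only data extraction over OPC UA is shown below, using the open-source python-opcua package; the endpoint address and node identifiers are placeholders, and this is not a prescribed NOA implementation.

```python
from opcua import Client  # FreeOpcUa "python-opcua" package

# Hypothetical endpoint and node IDs; in a NOA-style setup the monitoring
# application reads process values over a standard interface without
# writing back into the core automation system.
ENDPOINT = "opc.tcp://plant-gateway.example:4840"
NODE_IDS = ["ns=2;s=Pump01.Vibration", "ns=2;s=Pump01.BearingTemp"]

client = Client(ENDPOINT)
client.connect()
try:
    for node_id in NODE_IDS:
        value = client.get_node(node_id).get_value()  # read-only access
        print(node_id, "=", value)
finally:
    client.disconnect()
```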

