
Using data to predict the future

29 April 2018

Jan Larsson and Ravi Shankar argue that predictive engineering analytics is about much more than just big data.

Big data has been hailed as ‘the new oil’: a frontier that you can mine for insights and forecasts. We have moved from looking at data that tells us what happened, to why it happened, and on to what might happen next.

Predictive engineering analytics combines physics-based simulation with data mining, statistical modelling and machine learning techniques, using patterns in the data to build models of how the systems that generated the data actually work. With such models, it is possible to find out what the data you have can tell you about the data you don’t yet have.
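
As a minimal sketch of that idea, the fragment below fits a simple statistical model to a handful of measured operating points and uses it to predict performance at points that were never tested. The pump data and variable names are hypothetical, and Python is used for illustration only; the article does not prescribe a particular tool:

```python
import numpy as np

# Hypothetical test data: pump efficiency measured at a handful of flow rates.
flow_measured = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # litres/second
efficiency    = np.array([0.62, 0.74, 0.81, 0.79, 0.70])   # fraction

# Fit a simple statistical model (quadratic) to the pattern in the data.
model = np.polyfit(flow_measured, efficiency, deg=2)

# Use the model to predict performance at flow rates we never tested:
# what the data we have tells us about the data we don't yet have.
flow_unmeasured = np.array([25.0, 35.0, 45.0])
predicted = np.polyval(model, flow_unmeasured)
print(dict(zip(flow_unmeasured, predicted.round(3))))
```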

IoT and sensors are already transforming products. Mining the stream of information coming from products will be critical for maintaining them and for designing their replacements – but that is not the only part of product development where predictive engineering analytics matters. And if you think of predictive engineering analytics as relying only on big data, you are missing some of the key opportunities.

For many industries, the products created are no longer purely mechanical - they are complex devices combining mechanical and electrical controls, and functioning in ever more complex environments. That means engineering different systems, the ways they interface with each other, and the ways they interact with the outside world. At one level you are coping with electromechanical controls; at another you are creating a design that covers the cooling requirements of the electronics. And in future you will have to model all of that as part of a larger system. For example, systems inside a vehicle will begin to talk to other vehicles and to traffic systems on the roads they travel on.

One consequence of this increasing complexity is that physical testing during engineering is now routinely supplemented, and in some cases even replaced, by simulations that cover multiple systems and take into account the many different types of physics needed to model them all. This is valuable during design and in acceptance testing too. Either the physical design of the product or the demands of its final location may make it impossible to gather readings from a physical sensor to verify final performance. This is where a virtual, simulated sensor can augment the information from the physical device and enhance the usefulness of the test.
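
To make the virtual-sensor idea concrete, here is an illustrative sketch in which a model is calibrated against simulation output and then estimates a quantity that no physical probe can reach. All data, relationships and names here are invented for the example:

```python
import numpy as np

# Hypothetical scenario: a turbine-blade temperature cannot be measured
# directly, but a multi-physics simulation relates it to two accessible
# readings. We generate simulated "training" data for the virtual sensor.
rng = np.random.default_rng(0)
inlet_temp = rng.uniform(600, 900, 200)       # measurable (K)
shaft_speed = rng.uniform(3000, 9000, 200)    # measurable (rpm)
blade_temp = 0.8 * inlet_temp + 0.02 * shaft_speed + 150  # simulated target

# Calibrate the virtual sensor (a linear model) against the simulation data.
X = np.column_stack([inlet_temp, shaft_speed, np.ones_like(inlet_temp)])
coeffs, *_ = np.linalg.lstsq(X, blade_temp, rcond=None)

# During a physical test, the virtual sensor estimates the unreachable value
# from the two readings the physical instrumentation can actually provide.
estimate = np.dot([750.0, 6000.0, 1.0], coeffs)
print(f"estimated blade temperature: {estimate:.1f} K")
```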

Adopting new materials
On the other hand, demands for fuel efficiency or simply for more efficient manufacturing may mean adopting new types of materials and new production techniques instead of relying on well-known ones. Companies that have decades of experience with traditional materials such as steel and aluminium have to learn to work with new materials, often using additive manufacturing or combined additive and subtractive manufacturing. That means going back to physical tests and correlating those tests with simulations to understand things as basic as how materials behave across a range of temperatures, and what impact that behaviour will have on the system design.
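
One hedged illustration of that test-to-simulation correlation, using made-up coupon-test numbers: fit a temperature-dependent stiffness curve from physical tests so that a simulation's material model can interpolate it at untested temperatures:

```python
import numpy as np

# Hypothetical coupon-test results for an additively manufactured alloy:
# stiffness measured at several temperatures.
temperature_C = np.array([20, 100, 200, 300, 400])
youngs_modulus_GPa = np.array([110.0, 106.5, 101.0, 94.0, 85.5])

# Correlate test to simulation: fit a temperature-dependent property curve
# that the simulation's material model can then query.
coeffs = np.polyfit(temperature_C, youngs_modulus_GPa, deg=2)

def modulus_at(temp_C: float) -> float:
    """Material property lookup for the simulation, built from test data."""
    return float(np.polyval(coeffs, temp_C))

print(round(modulus_at(250.0), 1))  # stiffness at an untested temperature
```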

To address all these demands, companies need to integrate their testing methods and their simulation methods, and need to adopt more simulation and much more data management to accelerate the pace of engineering work.

This goes far beyond tracking requirements, CAD data and test results; engineering data management systems need to store all the engineering work, including simulation and verification, and to integrate test, sensor and performance data.

That becomes even more important as the trends of mass customisation and personalisation grow, making it impossible to test all the different versions of a product in all potential environments. Instead, simulations can be used to get broader coverage of all variants and usage scenarios. That allows you to go back at any point and prove that you verified a component with all the relevant systems, to look at the full range of data and see what might have led to a failure, and to use predictive analytics to forecast how products will perform.

The mathematical approach of big data analysis is certainly useful. But applying predictive analytics to the engineering space needs an approach that combines test data and physics-based simulation data in a common database environment (and not just a single type of physics but multiple types), allowing engineers to take very large test and simulation data sets and use them to create key performance metrics.
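
A toy example of the kind of combination described here, with invented load cases: join test and simulation records from a common store and derive a deviation metric from the merged data:

```python
import pandas as pd

# Hypothetical common store holding both kinds of data, keyed by load case.
test = pd.DataFrame({
    "load_case": ["LC1", "LC2", "LC3"],
    "measured_stress_MPa": [212.0, 340.0, 455.0],
})
simulation = pd.DataFrame({
    "load_case": ["LC1", "LC2", "LC3"],
    "predicted_stress_MPa": [205.0, 351.0, 470.0],
})

# Join the two sources and derive a key performance metric:
# how far the simulation deviates from the test for each load case.
combined = test.merge(simulation, on="load_case")
combined["deviation_pct"] = 100 * (
    combined["predicted_stress_MPa"] - combined["measured_stress_MPa"]
) / combined["measured_stress_MPa"]
print(combined)
```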

Predictive engineering analytics also covers exploring the design space efficiently by running multiple simulations with different parameters and analysing the resulting data intelligently, so you can understand key parameters and how they interact. This enables a design to be optimised to achieve robust performance that is not sensitive to changes in the environment.
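
The sketch below shows such a design-space sweep in miniature. The simulate function is a stand-in surrogate rather than a real solver, and the panel parameters are invented:

```python
import itertools

# Hypothetical surrogate standing in for a full simulation run.
def simulate(thickness_mm: float, rib_count: int) -> float:
    """Return peak panel deflection (mm) for a design; a stand-in model."""
    return 12.0 / (thickness_mm * (1 + 0.15 * rib_count))

# Sweep the design space: run every combination of the two parameters.
thicknesses = [1.5, 2.0, 2.5, 3.0]
rib_counts = [2, 4, 6]
results = {
    (t, r): simulate(t, r)
    for t, r in itertools.product(thicknesses, rib_counts)
}

# Analyse the runs: which combination performs best across the space?
best = min(results, key=results.get)
print("lowest deflection:", best, round(results[best], 2), "mm")
```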

Predictive analytics might even move into products, as control systems shift from detecting to forecasting conditions. Today, anti-lock brakes use sensors to detect when the car is beginning to skid. In the future, sophisticated control systems could use on-board cameras to detect that a vehicle driving in the rain is approaching a curve too fast for the road conditions, predicting that it will skid and controlling the vehicle before it does, to handle the curve safely.
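
The underlying prediction can be sketched in a few lines; this is purely illustrative physics with invented numbers, not a real control system. A skid is foreseeable when the lateral acceleration the curve demands, v²/r, exceeds the grip the road can supply, μg:

```python
G = 9.81  # gravitational acceleration, m/s^2

def predict_skid(speed_ms: float, curve_radius_m: float,
                 friction_mu: float) -> bool:
    """Compare required lateral acceleration v^2/r with available grip mu*g."""
    required = speed_ms ** 2 / curve_radius_m
    available = friction_mu * G
    return required > available

# Hypothetical scenario: a camera estimates an 80 m curve ahead, and rain
# has lowered the friction coefficient to roughly 0.4.
if predict_skid(speed_ms=25.0, curve_radius_m=80.0, friction_mu=0.4):
    print("slow the vehicle before entering the curve")
```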

Drowning in information
The size and scope of the data sets available enables advanced analytics, but this size also has consequences: engineers will be drowning in information unless they take steps to make big data manageable. The number of sensors in products today is only going to increase. Sensor readings already create huge data sets – too large to use the raw data in simulations, because they’d take too long to process. The scale of the data is going to require intelligent analytics to condense raw streams of readings into data that can usefully be fed into a simulation.
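
As a rough illustration of that condensation step (synthetic data, arbitrary window size), the snippet below reduces an hour of 1kHz vibration readings to per-second summary features that a simulation could plausibly consume as load input:

```python
import numpy as np

# Hypothetical raw stream: one hour of vibration readings sampled at 1 kHz.
rng = np.random.default_rng(1)
raw = rng.normal(0.0, 1.0, size=3_600_000)

# Condense the stream: reduce each one-second window to three summary
# features instead of feeding 3.6 million raw samples into a simulation.
windows = raw.reshape(-1, 1000)            # 3,600 one-second windows
features = np.column_stack([
    windows.mean(axis=1),                  # average level
    windows.std(axis=1),                   # intensity
    np.abs(windows).max(axis=1),           # peak event per window
])
print(raw.size, "->", features.shape)      # 3.6M readings -> 3,600 x 3
```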

Predictive analytics is already proving useful in engineering. It can be used with the simulation portfolio to investigate different architectures early in product development, to help understand which type of architecture is best suited to meet customers’ needs. 3D simulations can be created and integrated with those early architecture models. Then you can bring in test data and see how it correlates with the simulations to improve models and increase system fidelity. As software tools develop, it will be possible to simulate new types of physics to improve the accuracy of models, to simulate larger systems and more complex models, and to take advantage of more analytics tools.

Analytics can also be integrated into other tools. Having the information from those simulations in a central information system enables teams beyond the engineering organisation to understand architectural decisions and use that information in their own processes.

Combining data from devices with warranty information and data about customer satisfaction in a common data store allows big data analytics to be used to understand the significance of the different design factors that ultimately have an impact on business success. In a PLM environment, it is possible to integrate many different types of data in a more connected way; as the models, simulations and test results are updated and new customer information comes in, the analytics can take account of them to give an up-to-date view of the situation.
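
As a simple, hypothetical illustration of that kind of analysis, the snippet below joins invented per-unit design factors with field outcomes in one table and runs a first-pass correlation check to see which factors track business results:

```python
import pandas as pd

# Hypothetical common store: per-unit design factors alongside field outcomes.
units = pd.DataFrame({
    "operating_temp_C":   [55, 71, 63, 80, 58, 75, 67, 52],
    "vibration_g_rms":    [0.8, 1.4, 1.0, 1.9, 0.7, 1.6, 1.2, 0.6],
    "warranty_claims":    [0, 2, 1, 4, 0, 3, 1, 0],
    "satisfaction_score": [9, 6, 8, 4, 9, 5, 7, 10],
})

# A first-pass significance check: which factors move with warranty claims?
print(units.corr()["warranty_claims"].sort_values(ascending=False))
```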

Many companies are already taking advantage of one or more of these opportunities. But to really make the most of predictive engineering analytics, it is necessary to look at it more holistically. Bringing all the information sources together in a more integrated system enables predictive engineering analytics to be applied to new and emerging technologies as well as existing tools, getting more value out of the data and allowing for the more confident creation of products that perform well in complex environments.

Jan Larsson is senior marketing director EMEA at Siemens PLM Software. Ravi Shankar is director, global simulation product marketing at Siemens PLM Software.

