Out-of-the-box vision solutions

11 April 2022

Neil Sandhu explains how integration of machine vision solutions into a wide range of applications is now a much easier task than it has traditionally been.

A machine vision automation project is not something to undertake lightly. Until recently, it would have required expert programming and specialist knowledge. It could take months, and significant investment. The final rewards could be great, but the journey would often be long and hard. Some projects would inevitably stall, as the more cautious feared ending up with a system that did not fulfil its initial expectations.

Now, in just a few years, developments in both hardware and software have opened an accessible road to rapid integration. While many more end-users can benefit from off-the-shelf solutions for common tasks, experienced machine builders can radically reduce their development time and cost. Even a few years ago, this would have been unthinkable.

Hardware innovations, such as CMOS image sensors, alongside a broadening choice of technologies including 3D time-of-flight snapshot and stereo vision, have widened the options for generating high-quality image data. Vision sensors now have onboard processing power, so they can be programmed to run applications directly on the device or through more localised edge integrations.

Sensor apps
Sensor manufacturers have become software and solution providers. SICK, for example, has developed the AppSpace software ecosystem, which aims to make it easy to download a ready-made “App” to configure a programmable 2D or 3D vision sensor.

Through AppSpace, crucially, everyone benefits from a shared experience. The groundwork done for specific customer applications has been exploited to create more general, repeatable ‘Apps’. With ready-made, all-in-one, application-specific solutions, there is no longer a need to ‘reinvent the wheel’. 

Over the past 18 months, SICK has released Apps for a range of common vision applications, as well as for more complex classifications using deep learning. These are now available even on high-performance cameras, such as the SICK Inspector P611, which can fit in the palm of your hand.

All SICK’s 2D Inspector P programmable vision sensors come with a ready-to-use Quality Inspection App, radically shortcutting set-up time. Now SICK is going even further, developing ready-made ‘plug-and-play’ systems that arrive in a box, along with all the necessary hardware, ready to be powered up.

The options extend to common robot guidance tasks such as belt or bin picking. For static and mobile machines, there is a simple and direct connection to the robot controller or PLC, and systems are quick to teach and ready to use. For example, with SICK’s Belt Pick Toolkit App installed, the Trispector P programmable 3D vision camera becomes a stand-alone belt-picking sensor for both industrial and collaborative robots, including brands such as Universal Robots.
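To illustrate the kind of direct connection described above, here is a minimal Python sketch of one plausible way a pick pose reported by a belt-picking sensor could be forwarded to a Universal Robots controller as a URScript move command. The sensor’s message format, the IP addresses and the sensor port are assumptions made purely for illustration; only the general pattern of sending URScript text to the controller’s TCP interface reflects a real integration route, and the speed and acceleration values are placeholders.

import socket

SENSOR_ADDR = ("192.168.0.10", 2114)   # assumed sensor IP and port
ROBOT_ADDR = ("192.168.0.20", 30002)   # UR controller secondary (URScript) interface

def read_pick_pose(sensor_sock):
    # Read one newline-terminated record from the sensor; the
    # comma-separated "x,y,z,rz" format is an assumption for this sketch.
    line = b""
    while not line.endswith(b"\n"):
        chunk = sensor_sock.recv(64)
        if not chunk:
            raise ConnectionError("sensor closed the connection")
        line += chunk
    x, y, z, rz = (float(v) for v in line.decode().strip().split(","))
    return x, y, z, rz

def main():
    with socket.create_connection(SENSOR_ADDR) as sensor, \
         socket.create_connection(ROBOT_ADDR) as robot:
        while True:
            x, y, z, rz = read_pick_pose(sensor)
            # Forward the pose as a URScript linear move (metres and radians);
            # acceleration and speed values are placeholders for this sketch.
            cmd = f"movel(p[{x:.4f}, {y:.4f}, {z:.4f}, 0, 3.1416, {rz:.4f}], a=0.5, v=0.25)\n"
            robot.sendall(cmd.encode())

if __name__ == "__main__":
    main()

The point of the ready-made Apps is that detection and pose calculation happen on the sensor itself, leaving the integrator with little more than a thin forwarding layer of this kind, or a direct connection to the robot controller or PLC.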

With SICK’s Pallet Pocket Detection SensorApp, the measured values required for an autonomous mobile robot (AMR) to pick up a pallet or dolly are pre-processed and evaluated on the sensor, then transmitted directly to the vehicle control system.
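As a rough illustration of how such pre-processed measurements might be consumed on the vehicle side, the following Python sketch assumes the sensor reports a lateral offset and a yaw angle for the detected pallet pockets and converts them into a single correction for the final approach. The field names, units, JSON transport and IP address are hypothetical and do not represent SICK’s actual protocol.

import json
import math
import socket

SENSOR_ADDR = ("192.168.0.30", 2122)   # assumed sensor IP and port

def approach_correction(lateral_offset_mm, yaw_deg, approach_distance_mm=1200):
    # Combine the measured lateral offset with the extra drift the remaining
    # yaw error would add over the rest of the approach distance.
    yaw_rad = math.radians(yaw_deg)
    projected_offset_mm = lateral_offset_mm + approach_distance_mm * math.tan(yaw_rad)
    return projected_offset_mm, yaw_rad

def main():
    with socket.create_connection(SENSOR_ADDR) as sensor:
        # One JSON measurement per detection, e.g. {"lateral_offset_mm": 35, "yaw_deg": 2.5}
        data = json.loads(sensor.recv(4096).decode())
        offset_mm, yaw_rad = approach_correction(data["lateral_offset_mm"], data["yaw_deg"])
        print(f"shift {offset_mm:.0f} mm sideways, rotate {yaw_rad:.3f} rad before entering the pockets")

if __name__ == "__main__":
    main()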

With new SensorApps being introduced all the time, many common inspection, robot picking, and AMR navigation tasks can now be set up in a fraction of the time it took previously.

Ending up with a successful system will still demand careful planning, and it is advisable to consult a manufacturer with experience and a broad portfolio. However, there is now more opportunity than ever to reap the rewards that an automated machine vision system can bring: greater inspection accuracy and repeatability, reduced waste, and increased machine availability.

Neil Sandhu is UK product manager for Imaging, Measurement and Ranging at SICK. He is also Chair of the UK Industrial Vision Association (UKIVA).

