
Integrating Vision and Robotics

23 July 2010

On this Italian production line for cosmetic powder cases, the challenge was to program two robots to pick up face powder brushes from random piles and then accurately place them in an eight-slot shuttle. The robots work side by side, each with its own image acquisition system, and each picking up four brushes.

The robots on Vetraco’s cosmetic powder case production line must accurately place a face powder brush in the dedicated fixture behind the robots. The shaker table is in front, on the right.

This application involves close coordination between machine vision and robot movement. The cosmetic brushes arrive on the feeder table at random locations, so the machine vision system must identify each brush’s location and orientation and pass that information to the robot, which then moves into position to pick up the brush.

A further challenge came from the cosmetics retailer, who wants to offer the product in different sizes and shapes, which means different-sized brushes and different boxes to hold them. The system therefore has to be flexible and easy for the line operators to use.

The companies involved in this development in Italy were Vetraco, which produces complete assembly and packaging lines for cosmetics, and machine vision developer ImagingLab.

80 pieces per minute

‘To achieve the rate of 80 pieces per minute, we installed two “twin” robotic stations,’ explained Ignazio Piacentini of ImagingLab. The brushes are loaded on a programmable feeder (the Anyfeeder from FlexFactory), which shakes to spread them apart and make them available for picking. The Anyfeeder is interfaced to the work cell via a National Instruments LabVIEW library.
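As a sanity check on that figure, the cycle-time budget implied by the twin-station layout can be worked out in a few lines. Splitting the 80 pieces per minute evenly between the two robots is an inference from the article’s description, not a stated figure:

```python
# Back-of-the-envelope cycle-time budget for the twin-station layout.
# Assumption: the 80 pieces/min target is split evenly between the two
# robots, and each robot places four brushes per pick-and-place cycle.

target_rate = 80          # brushes per minute, whole line
stations = 2              # twin robotic stations
brushes_per_cycle = 4     # multiple-gripper capacity

per_station_rate = target_rate / stations                 # 40 brushes/min
cycles_per_minute = per_station_rate / brushes_per_cycle  # 10 cycles/min
cycle_time_s = 60 / cycles_per_minute                     # 6 s per cycle

print(f"Each robot has ~{cycle_time_s:.0f} s per four-brush cycle")
```

Six seconds per cycle covers image acquisition, part location, four picks, and the place into the tray.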

The vision system is based on a National Instruments image acquisition board, which uses IEEE 1394 (FireWire) to interface with the camera. The board has 29 lines of reconfigurable digital I/O for communicating triggers and results. The system uses an AVT CCD camera with a resolution of 1,400 × 1,000 pixels and a custom ImagingLab infrared illuminator.

The solution includes two DENSO SCARA robots with two vision systems, each with an NI image acquisition board. All the programming for the entire robotic cell was done with LabVIEW and the NI Vision Development Module.

The system acquires images of the brushes, determines their positions, and communicates the picking coordinates to the robot. If no parts are available for picking, the system shakes the Anyfeeder to deposit more brushes under the camera.
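The pick-or-shake decision can be sketched in a few lines of Python. The pose format and function names here are illustrative assumptions; the real cell is programmed graphically in LabVIEW:

```python
# Minimal sketch of the per-cycle decision described above. A brush
# "pose" is (x_mm, y_mm, angle_deg) as reported by the vision system;
# the names and pose format are illustrative, not Vetraco's actual code.

def plan_cycle(detected_poses, gripper_slots=4):
    """Return the poses to pick this cycle, or None to shake for more.

    detected_poses : list of (x_mm, y_mm, angle_deg) tuples from the
                     vision system's brush-location step.
    """
    if len(detected_poses) < gripper_slots:
        return None                       # not enough parts: shake the Anyfeeder
    # Fill the four-slot multiple gripper with the first four detections.
    return detected_poses[:gripper_slots]

print(plan_cycle([(10.0, 20.0, 90.0)] * 6))  # four poses to pick
print(plan_cycle([(10.0, 20.0, 90.0)]))      # None: shake and re-image
```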

Even the shaker is endowed with a certain amount of ‘intelligence’. Using information from the vision system, it chooses among four possible shaking modes: shake forward, shake backward, shake neutral, and load more parts.
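One plausible way to select among those four modes is from the part count and the parts’ average position along the feed axis. The thresholds and the position heuristic below are assumptions for the sketch, not the actual decision logic:

```python
# Illustrative decision logic for the four shaking modes named in the
# article (forward, backward, neutral, load more parts). The thresholds
# and the feeder-position heuristic are invented for this sketch.

def choose_shake_mode(part_count, mean_y, table_length=300.0):
    """Pick a shake mode from what the vision system saw.

    part_count : brushes visible under the camera
    mean_y     : average part position along the feed axis (mm)
    """
    if part_count == 0:
        return "load more parts"       # hopper dispenses a fresh batch
    if mean_y < table_length * 0.3:
        return "shake forward"         # push parts toward the pick area
    if mean_y > table_length * 0.7:
        return "shake backward"        # pull parts back from the edge
    return "shake neutral"             # just spread clustered parts

print(choose_shake_mode(0, 0.0))     # -> load more parts
print(choose_shake_mode(5, 50.0))    # -> shake forward
```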

‘We place the eight-slot trays for the brush boxes in front of the robot at fixed positions,’ explains Mr Piacentini. ‘The brushes can be in any position on the feeder, and the vision system has to localise the position and orientation of each part, because every brush must be placed in its box with the correct orientation.

‘Additionally, the robot has a multiple gripper for four brushes so the vision system has to identify and locate the position of four brushes at each picking cycle,’ he said. When a tray becomes full, a new one arrives. The robotic cells work 24 hours per day, seven days per week, rejecting any defective parts.

Tight vision, robot integration

The vision system guides the robots and provides quality control, measuring each part’s dimensions and verifying its integrity.
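The dimensional check amounts to a pass/fail tolerance test of this kind. The nominal size and tolerance below are invented for illustration:

```python
# Illustrative pass/fail check of the kind the vision QC step performs:
# compare a measured brush dimension against a nominal value with a
# tolerance. The nominal size and tolerance are invented for the sketch.

def brush_ok(measured_mm, nominal_mm=40.0, tol_mm=0.5):
    """Accept the part only if its measured length is within tolerance."""
    return abs(measured_mm - nominal_mm) <= tol_mm

print(brush_ok(40.3))   # True  -> place in tray
print(brush_ok(38.9))   # False -> reject as defective
```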

‘Using the ImagingLab Robotics Library for DENSO, we implemented tight vision and robotics integration,’ says Mr Piacentini. As a result, the user can calibrate imaging and robotics in a single operation.
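The idea behind such a calibration can be illustrated with a standard approach: fit a 2-D affine map from camera pixels to robot coordinates using a few points the robot has touched while they were visible to the camera. The point values and the NumPy-based fitting below are assumptions for the sketch, not ImagingLab’s actual method:

```python
import numpy as np

# One-shot camera-to-robot calibration sketch: fit a 2-D affine map from
# pixel coordinates to robot coordinates using three reference points
# where the robot touched targets visible to the camera. The point
# values are made up for illustration.

pixel_pts = np.array([[100.0, 100.0], [900.0, 100.0], [100.0, 700.0]])
robot_pts = np.array([[50.0, 200.0], [250.0, 200.0], [50.0, 50.0]])

# Solve robot = [px, py, 1] @ A for the 3x2 affine matrix A.
ones = np.ones((len(pixel_pts), 1))
P = np.hstack([pixel_pts, ones])              # 3x3 design matrix
A, *_ = np.linalg.lstsq(P, robot_pts, rcond=None)

def pixel_to_robot(px, py):
    """Map a camera detection into robot workspace coordinates."""
    return np.array([px, py, 1.0]) @ A

print(np.round(pixel_to_robot(500, 400), 3))  # near [150, 125]
```

With the map fitted once, every subsequent brush detection can be handed to the robot directly in its own coordinate frame.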

‘By adopting the LabVIEW platform, we successfully programmed, prototyped, and tested new robotics applications very quickly. Another great advantage of using the LabVIEW platform was the ability to design and customise the graphical user interface,’ says Mr Piacentini.

‘Because the same LabVIEW program will run on devices ranging from PCs to NI Smart Cameras, we have a lot of flexibility in choosing NI hardware for our systems. We can also reuse parts of the hardware and equipment from past projects. Moreover, the NI standard allows fast product integration, which shortens project development time and cost.’
 

