It’s all in your head

01 November 2007

Print and packaging companies have looked to the human brain for tips on how to boost optical character recognition (OCR) in machine vision systems. Earl Yardley, director of Industrial Vision Systems, explains.

Example of OCR used in pharmaceutical inspection of miniature needle tip sub-assemblies

The latest generation of machine vision solutions utilises neural network classification to provide OCR and product identification in print and packaging applications. Neural networks imitate some of the principles of information processing in the brain and are adaptable for machine vision processes.

The information processing power of the brain stems from the interconnection of many rather simple processors: the neurons. Similarly, an artificial neural network is constructed from many simple units, connected by many weighted links. This architecture allows the realisation of practically arbitrary transfer functions, i.e. relationships between input and output signals. The transfer function actually created depends on the weights of the internal connections. One of the main advantages of neural networks is that it is not necessary to construct this function explicitly.
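The behaviour of a single unit can be sketched in a few lines of Python. The weights and the sigmoid activation below are illustrative choices, not taken from any particular product; the point is that the same unit computes a different transfer function purely by changing its connection weights:

```python
import math

def unit_output(inputs, weights, bias):
    """One processing unit: weighted sum of its inputs, squashed by a sigmoid."""
    total = bias + sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-total))

# Same inputs, different weights -> different input/output relationship.
print(unit_output([0.5, 1.0], [2.0, -1.0], 0.0))  # sigmoid(0.0) = 0.5
print(unit_output([0.5, 1.0], [4.0, -1.0], 0.0))  # sigmoid(1.0), about 0.73
```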

Training algorithms enable neural networks to derive the weights needed to create the desired relationship between input and output from a set of training patterns. Machine vision software applications, such as NeuroCheck, use networks of the multi-layer perceptron type. They consist of three layers of processing units. The first layer receives the input signals and transfers them to the second layer. The second layer, or 'hidden layer', does the actual processing of the signals. Figure 1 shows a simple network with five inputs, four hidden units and three outputs. Typical networks for digit classification have between 100 and 300 inputs, 10 to 50 hidden units and 10 output units (one for each digit).

A training pattern consists of an input signal, i.e. a collection of feature values or templates for character recognition, and the correct class information for the object described by the feature values.
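In code, such a training set is simply a list of feature vectors paired with the correct class labels. The feature values below are invented placeholders, standing in for whatever measurements are extracted from the character images:

```python
# Each pattern: (feature values extracted from a character image, correct class).
# The numbers here are made up for illustration only.
training_patterns = [
    ([0.12, 0.80, 0.33, 0.90, 0.05], "0"),
    ([0.75, 0.10, 0.60, 0.22, 0.88], "1"),
    ([0.40, 0.55, 0.95, 0.14, 0.31], "7"),
]

for features, label in training_patterns:
    print(label, len(features))
```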

Constructing a neural network application
The first step is a precise specification of the classification task, usually by defining the required classes. The next step is to choose the features describing the objects. The features have to represent those properties of the objects essential for distinguishing the classes. Beyond that requirement, the feature selection largely depends on the specific problem and may have to be revised if it turns out that the objects cannot be recognised reliably using the selected features. In the next step, training data has to be generated.

With NeuroCheck software, configuring the network reduces to setting the number of hidden units, because the input and output layers are already determined by the problem specification. Afterwards the network can be trained.

Finally, the classifier has to be tested, preferably with a set of patterns not used for training. An unsatisfactory recognition rate in this test often points to an inappropriate selection of features, which should then be revised.
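Measuring the recognition rate on held-out patterns reduces to counting correct predictions. In this sketch, `classify` is a hypothetical stand-in for whatever classifier was trained, and the two-feature test patterns are invented:

```python
def recognition_rate(classify, test_patterns):
    """Fraction of held-out patterns the classifier labels correctly."""
    correct = sum(1 for features, label in test_patterns
                  if classify(features) == label)
    return correct / len(test_patterns)

# Toy stand-in classifier: the position of the largest feature decides the class.
def classify(features):
    return "1" if max(features) == features[0] else "0"

test_patterns = [([0.9, 0.1], "1"), ([0.2, 0.8], "0"), ([0.7, 0.3], "0")]
print(recognition_rate(classify, test_patterns))  # 2 of 3 patterns correct
```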

Using classifiers
Because classification tries to model aspects of human reasoning, it is a complex subject. NeuroCheck makes applying a classifier to a problem as easy as possible, performing many of the tasks necessary for creating and using classifiers automatically. Nevertheless, a specific procedure has to be observed, and some thought has to be invested in how to make the best use of this technology.

When to use a classifier
Not every problem needs a classifier. A simple screening process is sufficient if the distinction between objects of different classes can be made by comparing a few measurements to fixed thresholds. As soon as there are more complex, possibly non-linear, relationships between the measurements, a classifier becomes worth considering.
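The screening case can be written directly as threshold comparisons. Once the accept/reject boundary is no longer a simple box in feature space like this, explicit thresholds stop working and a trained classifier earns its keep. The measurements and limits below are illustrative:

```python
def screen(width_mm, height_mm):
    """Simple screening: accept a part only if each measurement is in tolerance."""
    return 9.5 <= width_mm <= 10.5 and 4.8 <= height_mm <= 5.2

print(screen(10.0, 5.0))  # True  -> part passes
print(screen(10.0, 5.5))  # False -> part rejected
```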
