Machine vision goes 3D


Industrial machine vision is fulfilling increasingly complex tasks, but it is also becoming increasingly easy to use.

The automobile industry is an essential driver of innovation in machine vision. It often requires precise 3D measurements because new and complex process chains have to be linked optimally with one another. Ease of operation and fast integration into the process environment are important success factors that can smooth the way into everyday automation.

This also applies to 3D scanners, which should do more than act as simple image recorders or profile generators.

Suppliers are starting to integrate specially developed methods into 3D cameras that provide an optimal balance of scan rate and image quality. For example, pre-processing modules implemented in hardware can reduce data volumes, taking the load off the evaluation PC. Application programming interfaces simplify integration into the user's environment, and support for various machine vision libraries simplifies operation.

Structured-light 3D scanning is already a common optical method to obtain depth information about an object. To this end, various patterns composed of illuminated and unilluminated stripes are projected one after another on an object and recorded by a camera. Using triangulation, it is possible to calculate depth information and reconstruct the object spatially.
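The two steps described above can be sketched in a few lines: decoding the stripe sequence seen at a pixel into a projector stripe index, then triangulating depth from the resulting disparity. This is a minimal illustration only; the Gray-code scheme, the pixel bit sequence, and the camera parameters are assumptions, not the internals of any particular scanner.

```python
# Sketch of structured-light depth recovery (all values illustrative).

def gray_to_index(bits):
    """Decode the on/off stripe sequence seen at one pixel (MSB first).

    Projecting N Gray-code stripe patterns one after another gives each
    projector column a unique binary signature; decoding the signature
    identifies which stripe illuminated the pixel.
    """
    idx, prev = 0, 0
    for g in bits:
        prev ^= g                  # Gray-code bit -> plain binary bit
        idx = (idx << 1) | prev
    return idx

def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Triangulation: depth is inversely proportional to the disparity
    between the projected and the observed stripe position."""
    return baseline_mm * focal_px / disparity_px

# A pixel that saw the sequence lit/dark/lit decodes to stripe index 6
stripe = gray_to_index([1, 0, 1])               # -> 6

# Assumed setup: 100 mm baseline, 800 px focal length, 40 px disparity
z = depth_from_disparity(100.0, 800.0, 40.0)    # -> 2000.0 mm
```

With more projected patterns the stripe index becomes finer, which is why these scanners trade scan rate against lateral resolution.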

Such systems pave the way for new applications, especially the mobile 3D digitisation of delicate components of small to medium size. Particularly robust structured-light 3D scanners exist for larger components and can even be adapted to standard robot systems.

A comparatively new 3D measurement procedure is optical time-of-flight (ToF) technology, which competes with established sensor technologies such as radar and ultrasound or serves as a supplement to them.

Three methods are already being used in sensors and customer applications – the continuous-wave (CW) method and the direct and indirect pulse methods. While the CW method determines the phase shift between the emitted and received waves, the direct pulse method measures the time elapsed between the emitted and received pulse.
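Both relations can be written down directly. The following is a minimal sketch; the 20 MHz modulation frequency and the timing values are illustrative assumptions, not specifications of any particular sensor.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_cw(phase_shift_rad, mod_freq_hz):
    """Continuous-wave ToF: the phase shift between the emitted and the
    received modulated signal encodes distance (unambiguous only up to
    c / (2 * f_mod), after which the phase wraps around)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def distance_direct_pulse(round_trip_s):
    """Direct pulse ToF: half the measured round-trip time times c."""
    return C * round_trip_s / 2.0

# A quarter-cycle phase shift at an assumed 20 MHz modulation frequency
d_cw = distance_cw(math.pi / 2.0, 20e6)     # ~1.87 m

# A 20 ns round trip measured with the direct pulse method
d_pulse = distance_direct_pulse(20e-9)      # ~3.0 m
```

The phase-wrapping limit is why CW sensors typically combine several modulation frequencies to extend their unambiguous range.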

In indirect time-of-flight measurement, a single light pulse is emitted towards the object to be detected, and the returning pulse is measured via two integration windows of different lengths.
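A common way to realise this scheme is to integrate the returning pulse in one window aligned with the emitted pulse (Q1) and a second window starting the moment the pulse ends (Q2): the later the echo arrives, the more of its charge falls into the second window, so the charge ratio encodes the delay. A minimal sketch, with the pulse width and charge values as illustrative assumptions:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_indirect_pulse(q1, q2, pulse_width_s):
    """Indirect pulse ToF: Q1 integrates during the emitted pulse, Q2
    immediately after it. The ratio Q2 / (Q1 + Q2) grows linearly with
    the echo delay, so it maps directly to distance."""
    return 0.5 * C * pulse_width_s * q2 / (q1 + q2)

# Echo charge split evenly between the two windows with a 30 ns pulse:
# the delay is half the pulse width, giving roughly 2.25 m
d = distance_indirect_pulse(1.0, 1.0, 30e-9)
```

Because only a charge ratio is needed, the per-pixel circuitry stays simple enough to build large ToF imager arrays.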

In another measurement, the background is detected and eliminated “on chip”, which safeguards operation in unfavourable lighting conditions.
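One simple way such background suppression can work is to take an ambient-only measurement with the light source switched off and subtract it from both integration windows before forming the ratio. The sketch below is a hypothetical illustration of that idea, not the circuit of any specific sensor; all values are assumed.

```python
C = 299_792_458.0  # speed of light in m/s

def distance_background_corrected(q1, q2, q_ambient, pulse_width_s):
    """Subtract an ambient-only charge measurement from both windows so
    that sunlight or hall lighting does not bias Q2 / (Q1 + Q2)."""
    q1c = max(q1 - q_ambient, 0.0)   # background-corrected window charges
    q2c = max(q2 - q_ambient, 0.0)
    return 0.5 * C * pulse_width_s * q2c / (q1c + q2c)

# Strong ambient light adds the same offset to both windows; after
# correction the even split of the 30 ns example is restored (~2.25 m)
d = distance_background_corrected(3.0, 3.0, 2.0, 30e-9)
```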

These developments and more can be seen at the Machine Vision sector of AUTOMATICA 2012 from May 22-25 in Munich. AUTOMATICA covers all areas of robotics and automation every two years.

For more information: