We’ll briefly describe the structured light and time-of-flight principles and then describe the unique characteristics of this system when combining both methodologies. This is followed by early experimental results confirming its operation. (For more background, Wikipedia provides a good primer on some of the fundamental principles of this class of non-contact optical 3D scanning: 3D scanners.)
The following image illustrates the principle of operation of Alces’ hybrid structured light system. The structured light system is based on “phase-shift” methodologies, which use 1D sinusoidal patterns to uniquely encode each projected column with a specific “phase” value. In the example shown in the image, four sinusoidal patterns are supplied to the projector and continuously projected in a 4-pattern loop.
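To make the 4-pattern loop concrete, here is a minimal sketch of how such fringe patterns could be generated. This is illustrative only (NumPy assumed; the function name, resolution, and fringe-period parameters are hypothetical, not from Alces’ system):

```python
import numpy as np

def make_phase_shift_patterns(width=1024, periods=8, steps=4):
    """Generate `steps` 1D sinusoidal fringe patterns, each spatially
    shifted by 360/steps degrees (90 degrees for a 4-pattern loop).

    Hypothetical parameters: `width` is projector columns, `periods`
    is the number of fringe cycles across the image."""
    x = np.arange(width)
    patterns = []
    for k in range(steps):
        shift = 2 * np.pi * k / steps            # 0, 90, 180, 270 degrees
        phase = 2 * np.pi * periods * x / width  # phase encodes the column
        # Intensity normalized to [0, 1]; a real projector would scale
        # this to its bit depth.
        patterns.append(0.5 + 0.5 * np.cos(phase + shift))
    return patterns
```

Each column gets a unique phase within a fringe period, which is what lets the camera later identify which projector column it is looking at.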
Let’s walk through this figure step-by-step to understand how this measurement takes place:
Phase-shifting Structured Light Fundamentals:
- Four input patterns are supplied to the projector. Each pattern is spatially shifted by 90 degrees, so after four patterns the sinusoidal intensity variation has been shifted a full 360 degrees and the pattern cycle repeats.
- A structured light system uses triangulation to determine the XYZ position of a point in space. The projector and camera are therefore spatially separated along one axis. The intersection of the ray associated with each camera pixel and the plane associated with each projector column creates a unique correspondence between the angles associated with the pixels on the camera and the columns of the projector.
- From the camera’s point of view, each pixel sees a unique time-varying signal. This is perhaps the most important principle to understand: the spatially varying pattern from the projector has become a time-varying pattern at each camera pixel. The time-varying signal at each pixel has a unique phase. That phase encodes the angle of the corresponding projector column, which is the fundamental piece of information needed to perform the triangulation calculation and determine the XYZ position of a point in space.
- The phase of the time-varying signal is calculated with some basic trigonometry. The projector column seen by each camera pixel can thus be determined directly from the phase value recovered at that pixel.
A time-of-flight camera does not use a spatially varying sequence of patterns but instead relies solely on the modulation of a point source, such as an LED or laser, to create a time-varying signal. The time-of-flight camera contains a unique image sensor with pixels capable of “phase-sensitive detection.” These pixels operate at very high rates, typically >20 MHz, which allows them to essentially interpolate the phase delay of the light as it is sent out and reflected off an object in space. The following figure illustrates this concept. The phase delay between the source signal (blue) and the received signal (red) is used, along with the known speed of light, to determine the Z-position of the object.
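The conversion from phase delay to distance is a one-line calculation. A measured delay of φ radians at modulation frequency f corresponds to a round-trip time of φ/(2πf), and halving the round trip gives range. A small sketch (function name and parameters are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_delay_rad, mod_freq_hz):
    """Convert the phase delay between emitted and received modulation
    into object distance. The division by 2 accounts for the light
    travelling out to the object and back."""
    round_trip_time = phase_delay_rad / (2 * math.pi * mod_freq_hz)
    return C * round_trip_time / 2
```

One consequence worth noting: the phase wraps every 2π, so the unambiguous range is c/(2f), about 7.5 m at a 20 MHz modulation rate.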
Alces’ patent application describes a hybrid approach combining elements of the structured light and time-of-flight systems. By combining the high modulation rates of the time-of-flight approach with the resolution of the structured light approach, an improved system can be designed with new features and a broader application base. Such a system is theoretically capable of very high frame rates, ambient light rejection, and very precise depth resolution.
Alces is still in the early stages of developing this type of sensor system, but with the explosion of the Microsoft Kinect and a burgeoning market around gesture and natural user interfaces (NUI), this type of technology has great potential. We will be discussing additional details in the future but are always open to further inquiries in the meantime.