Researchers in Canada and the US have developed a new 3D imaging technology that — unlike other depth sensing systems — can work in bright light, because it only gathers the light it actually needs to create the image.
Invented by researchers at Carnegie Mellon University and the University of Toronto, the system uses a mathematical model that optimizes the way the camera and its light source work together, eliminating any light that would otherwise make it difficult to spot contours. The work was recently presented at SIGGRAPH 2015 in Los Angeles.
“We have a way of choosing the light rays we want to capture and only those rays,” said Srinivasa Narasimhan, CMU associate professor of robotics. “We don’t need new image-processing algorithms and we don’t need extra processing to eliminate the noise, because we don’t collect the noise. This is all done by the sensor.”
One of the prototypes developed for the project uses a laser projector and a rolling-shutter camera like those found in smartphones. As the laser sweeps across the scene, the camera detects only its light as it bounces off elements in the scene. The two devices are synchronized so that the camera accepts light only from the same plane the laser is currently scanning. This unique setup allows it to capture brightly lit objects, such as an illuminated light bulb, that other systems would not be able to accurately depict.
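The synchronization idea above can be illustrated with a toy simulation. This is not the researchers' actual code, and every scene value in it is made up; it only shows why exposing each camera row solely while the laser lights the matching plane rejects most of the ambient light.

```python
import numpy as np

# Illustrative sketch: compare a camera that integrates the whole frame
# with one whose rolling-shutter row is locked to the scanning laser line.
H, W = 8, 8
scene_albedo = np.random.default_rng(0).random((H, W))  # made-up scene
ambient = 5.0   # strong ambient light (e.g. sunlight) per pixel per frame
laser = 1.0     # laser energy deposited on the currently lit row

naive = np.zeros((H, W))   # global exposure: every row sees every instant
synced = np.zeros((H, W))  # rolling shutter synchronized to the laser plane

for row in range(H):                         # the laser sweeps row by row
    frame = np.full((H, W), ambient / H)     # ambient spread over H time slots
    frame[row] += laser * scene_albedo[row]  # laser lights only this row
    naive += frame                           # naive camera integrates everything
    synced[row] = frame[row]                 # synced camera keeps only the lit row

# Both cameras capture the same laser signal, but the synced one collects
# only 1/H of the ambient light, so its signal-to-ambient ratio is H x better.
```

With these assumptions, the difference between the two images is pure ambient light, which is exactly the "noise we don't collect" that Narasimhan describes.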
“Even though we’re not sending a huge amount of photons, at short time scales, we’re sending a lot more energy to that spot than the energy sent by the sun,” said Kyros Kutulakos, U of T professor of computer science and a fellow researcher on the project. The trick is to record only the light from that spot as it is illuminated, rather than trying to pick the spot out of the entire bright scene.
The researchers say the device could be useful for video games, self-driving cars, medical imaging and even space research. William Whittaker, University Professor of Robotics at CMU, said the system could be particularly useful in the moon’s polar regions, where eliminating glare is essential.
“Low-power sensing is very important,” Whittaker said, noting that a robot’s sensors expend a relatively large amount of energy because they are always on. “Every watt matters in a space mission.”