When you try to take pictures in low-light conditions, you are largely left with one of two options: use the flash and get all sorts of unnatural and uneven lighting, or skip the flash and get one big blurry mess. Well, a pair of researchers is coming up with a much better alternative.
Some people are calling it a “dark” flash, whereas others are referring to it as an “invisible” flash. Whatever you choose to call it, the innovation is supposed to provide us with much better photos at night and under other dim lighting conditions.
Hailing from New York University, Dilip Krishnan and Rob Fergus are developing a two-step technique that could be completely automated in regular digital cameras (and maybe even camera phones). The flashbulb has been modified to emit light outside the visible range, in the ultraviolet and infrared, while filtering out the visible light that makes a conventional flash so harsh.
The UV and IR filters that are normally present in camera sensors have also been removed. The net result is a picture that looks like an infrared image, similar to the picture you see on the left. The blur is gone and the lighting is even, but it’s the wrong color, right?
An algorithm takes care of that. A second photo is taken immediately after the first one, but without the “dark” flash. This gives the camera a grainy, shaky shot that nonetheless carries accurate color information. Combining the detail from the first pic with the colors of the second, you get the picture on the right.
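The researchers' actual algorithm is more sophisticated, but the basic idea of borrowing detail from one frame and color from another can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the published method): smooth the noisy ambient frame to suppress grain, extract its color ratios, and reapply them on top of the sharp flash frame's luminance. The function name and the simple box filter are assumptions for the sake of the example.

```python
import numpy as np

def fuse_dark_flash(sharp_gray, noisy_color, blur_kernel=5):
    """Simplified sketch of dark-flash fusion.

    sharp_gray:  HxW float array in [0, 1], the crisp but wrongly
                 colored flash shot (treated as luminance).
    noisy_color: HxWx3 float array in [0, 1], the grainy no-flash shot.
    """
    # Smooth the noisy frame with a plain box filter to suppress grain;
    # a real pipeline would use an edge-aware filter instead.
    k = blur_kernel
    pad = k // 2
    padded = np.pad(noisy_color, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    h, w = noisy_color.shape[:2]
    smooth = np.zeros_like(noisy_color)
    for dy in range(k):
        for dx in range(k):
            smooth += padded[dy:dy + h, dx:dx + w]
    smooth /= k * k

    # Per-pixel luminance of the smoothed ambient frame.
    luma = smooth.mean(axis=2, keepdims=True)

    # Color ratios from the ambient shot, detail from the flash shot.
    ratio = smooth / np.maximum(luma, 1e-6)
    return np.clip(ratio * sharp_gray[..., None], 0.0, 1.0)
```

In practice you would feed in the dark-flash exposure as `sharp_gray` and the ambient exposure as `noisy_color`; the output keeps the flash frame's edges while taking on the ambient frame's hues.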
This sounds like it could be quite a fantastic innovation if it really works as promised, but it almost sounds like the camera has to be dedicated to this purpose. If they can merge the technology with existing tech for “regular” photos, they could have a very lucrative patent on their hands.