MIT speeds up its futuristic lensless single-pixel camera by 50 times



[Image: MIT lensless single-pixel camera]

 

Traditionally, cameras achieve resolution by having many pixels spread across space — the surface of the sensor. They also use a lens to focus the image on that sensor. Single-pixel imagers turn this approach on its head, by using randomly patterned light to illuminate the scene, and recording the result on just one pixel. By repeating this process many times, it’s then possible to use computational imaging to construct a traditional photograph of the original scene.
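
To make that concrete, here is a toy sketch in Python (my own illustration, not the researchers' code) of the capture step: each exposure multiplies the scene by a random binary illumination pattern and sums the result onto a single detector. The scene contents, resolution, and number of exposures are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 32, 32                          # hypothetical scene resolution
scene = np.zeros((H, W))
scene[8:24, 8:24] = 1.0                # a bright square stands in for the subject

n_exposures = 400                      # fewer than the H*W = 1024 pixel values

# One random 0/1 illumination pattern per exposure.
patterns = rng.integers(0, 2, size=(n_exposures, H * W)).astype(float)

# The single pixel only ever records the total reflected light for each pattern.
measurements = patterns @ scene.ravel()
print(measurements.shape)              # (400,): one number per exposure
```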

 

Compressive sensing allows cameras without lenses


One interesting property of compressive sensing cameras — those which use a limited number of pixels through a specialized aperture, or in this case, just a single pixel — is that they don’t require lenses. Each pixel simply records the sum of reflected illumination that passes through the aperture and lands on it during each lighting interval. Since lenses are often the largest, heaviest, and most expensive component of an imaging system, solutions that eliminate the need for them are of great interest. The downside has been that it can take hundreds or thousands of individual captures to produce a reasonable-quality final image. That means the process is slow, and is only suitable for stable subjects.
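
Recovering an image from those summed measurements is a sparse-reconstruction problem. The sketch below continues the previous one and uses plain iterative soft-thresholding (ISTA) as a stand-in for the far more sophisticated solvers used in the actual research.

```python
import numpy as np

def ista(A, y, lam=0.1, n_iters=500):
    """Recover a sparse x from y ~= A @ x by iterative soft-thresholding."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)                 # gradient of the data-fit term
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# With `patterns` and `measurements` from the previous sketch:
# recovered = ista(patterns, measurements).reshape(32, 32)
```

The point of the exercise is that a reasonable image can be recovered from far fewer measurements than there are pixels, provided the scene is sparse in some representation.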

 

[Image: single-pixel 3D image]

 

Why compressive sensing is important


For most applications, traditional cameras and lenses work well. But in harsh environments, the need for a lens creates design challenges, and for non-visible wavelengths, a low-resolution sensor can be much easier to construct. Practical applications of compressive sensing will likely use more than a single pixel, but the single-pixel model has become a benchmark among researchers for the state of the art.

 

Among the use cases for compressive sensing is medical imaging. The high cost of equipment and the negative side effects of prolonged exposure to radiation mean that minimizing the amount of information that needs to be captured, and maximizing the amount that can be reconstructed computationally, is valuable.

 

Adding time of flight makes the process 50 times faster


Rice University’s early design for a single-pixel imager required thousands of exposures. Researchers from MIT’s Media Lab have published results reducing that by a factor of 50, to perhaps just dozens of exposures. They achieve this by using quick pulses of light and time of flight sensors to record when the reflected light returns to the camera. That gives them a time series of data for each pulse, with each sample consisting of the reflection from all the elements of the scene at a specific distance. As a result, they get far more information to use for computation than if they only recorded the total light reflected from the scene. Using time of flight sensing to capture images isn’t new, but the team at MIT says that their approach of combining it with compressive sensing and structured light is novel (although I did find a paper published in Nature around the same time that discusses the use of those technologies in a similar fashion).
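
As a rough illustration of why time resolution helps (an assumed model, not MIT's implementation), the sketch below compares a conventional single-pixel measurement with a time-resolved one: the same illumination pattern yields one number in the first case and one number per distance slice in the second.

```python
import numpy as np

rng = np.random.default_rng(1)

H, W, n_bins = 32, 32, 16              # hypothetical resolution and number of time bins
scene = rng.random((H, W))             # reflectance of each scene point
depth_bin = rng.integers(0, n_bins, size=(H, W))   # distance slice each point falls in

pattern = rng.integers(0, 2, size=(H, W)).astype(float)

# Conventional single-pixel capture: one number per illumination pattern.
scalar_measurement = np.sum(pattern * scene)

# Time-resolved capture: one number per distance slice for the same pattern.
tof_measurement = np.array(
    [np.sum(pattern * scene * (depth_bin == t)) for t in range(n_bins)]
)

assert np.isclose(tof_measurement.sum(), scalar_measurement)
print(tof_measurement.shape)           # (16,): 16 values from a single pulse
```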

 

To accomplish this requires some high-end hardware, including picosecond-accurate time of flight sensors. However, that’s likely to change. For example, I saw an impressive project at Stanford that uses much less expensive micromirrors, coupled with amplitude modulation, to achieve similar efficiency in image capture at a substantially lower cost.

 

For another esoteric camera design, read our article about one that can essentially see in the dark.

 

Note: MIT uses the term “compressed sensing” to describe this technology, an alternative to “compressive sensing.” You can read MIT’s announcement, as well as an earlier version of the MIT paper on using time of flight with compressive sensing.

 

Source
