New Camera System Inspired By Animal Vision

Scientists have taken inspiration from how animals’ eyes work to create a new way for computer-controlled cameras to ‘see’. They have found a way to instruct cameras to prioritise objects in images using a method similar to the way brains make the same decisions.

Image credit: University of Glasgow

Inspired by animals’ eyes, researchers at the University of Glasgow have created a new camera system: a way for computer-controlled cameras to ‘see’. They discovered a way to instruct a camera to prioritize objects in images, using a method similar to the way brains make the same decisions.

The eyes and minds of humans and animals work in tandem to prioritize specific areas of their vision. During a conversation, for example, visual attention is usually focused on the other speaker, with less of the brain’s ‘processing time’ given over to peripheral details.

Dr. David Phillips, who led the research, said: “Initially, the problem I was trying to solve was how to maximize the frame rate of the single-pixel system to make the video output as smooth as possible.”

The scientists used a sensor that relies on a single light-sensitive pixel to build up moving images of objects placed in front of it. This single-pixel sensor is much cheaper than dedicated megapixel sensors, and it also holds potential for capturing images at wavelengths where conventional sensors are expensive or unavailable.
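A minimal sketch of how a single-pixel camera can recover a full image (this is an illustration of the general technique, not the Glasgow team's actual code): a sequence of binary masks is projected onto the scene, the lone pixel records one total-intensity value per mask, and the image is reconstructed from those readings. Here orthogonal Hadamard masks are assumed, which makes reconstruction a simple transpose.

```python
import numpy as np

def hadamard(n):
    """Build an n x n Hadamard matrix (n must be a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def single_pixel_capture(scene, masks):
    """Each measurement is the total light the one pixel sees through a mask."""
    return masks @ scene.flatten()

def reconstruct(measurements, masks, shape):
    """For Hadamard masks, H^T H = n I, so inversion is just a transpose."""
    n = masks.shape[0]
    return (masks.T @ measurements / n).reshape(shape)

side = 8                                 # a toy 8x8 = 64-'pixel' image
scene = np.random.rand(side, side)       # stand-in for the real scene
masks = hadamard(side * side)            # one mask row per measurement
m = single_pixel_capture(scene, masks)   # 64 readings from one pixel
recovered = reconstruct(m, masks, scene.shape)
```

Note that recovering a 64-pixel image needs 64 sequential measurements, which is exactly why frame rate becomes the bottleneck the researchers set out to solve.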

The camera system captures images with an overall resolution of 1,000 pixels. It can allocate its ‘pixel budget’ to prioritize the most important areas within the frame, placing higher-resolution pixels in those locations and so sharpening the detail of some sections while sacrificing detail in others. The pixel distribution can be changed from one frame to the next, much as biological vision systems work.
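The pixel-budget idea can be illustrated with a toy example (an assumption-laden sketch, not the published system): keep a chosen region of interest at full resolution while averaging the periphery into coarse blocks, so the same fixed budget of measurements buys detail where it matters most.

```python
import numpy as np

def foveate(frame, roi, coarse=4):
    """Downsample everything outside roi = (r0, r1, c0, c1) into coarse blocks."""
    out = frame.copy()
    h, w = frame.shape
    r0, r1, c0, c1 = roi
    for r in range(0, h, coarse):
        for c in range(0, w, coarse):
            # Leave blocks that overlap the region of interest untouched.
            if r + coarse > r0 and r < r1 and c + coarse > c0 and c < c1:
                continue
            block = (slice(r, r + coarse), slice(c, c + coarse))
            out[block] = frame[block].mean()   # one coarse 'pixel'
    return out

frame = np.arange(64.0).reshape(8, 8)
sharp = foveate(frame, roi=(0, 4, 0, 4))   # keep the top-left quadrant sharp
```

Because the region of interest is just a parameter, a new one can be chosen for every frame, mirroring how the distribution of detail shifts from frame to frame.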

“I started thinking about how vision works in living things, and I realized that building a program which could interpret the data from our single-pixel sensor along similar lines could solve the problem. By channeling our pixel budget into areas where high resolution was beneficial, such as where an object is moving, we could instruct the system to pay less attention to the other areas of the frame.

“By prioritizing the information from the sensor in this way, we’ve managed to produce images at an improved frame rate, but we’ve also taught the system a valuable new skill,” he added.
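One simple way to decide where the moving object is, and hence where to spend the next frame's resolution, is frame differencing. This is a hypothetical stand-in for the system's actual attention logic: pixels that changed between frames are flagged and bounded into a region of interest.

```python
import numpy as np

def motion_roi(prev, curr, thresh=0.1):
    """Return the bounding box (r0, r1, c0, c1) of pixels that changed."""
    moved = np.abs(curr - prev) > thresh
    if not moved.any():
        return None                      # nothing moved: spread budget evenly
    rows, cols = np.nonzero(moved)
    return rows.min(), rows.max() + 1, cols.min(), cols.max() + 1

prev = np.zeros((8, 8))
curr = np.zeros((8, 8))
curr[2:4, 5:7] = 1.0                     # a small object appears
roi = motion_roi(prev, curr)             # bounding box of the change
```

The returned box could then be fed to the foveation step for the next frame, concentrating the pixel budget wherever motion was detected.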

The scientists are now working to improve the system and exploring opportunities for industrial and commercial use.
