New advanced vision sensors emulate human visual adaptability
Future autonomous vehicles and industrial cameras could have human-like vision, thanks to recent work by researchers at the Hong Kong Polytechnic University and Yonsei University.
Machine vision systems must be able to “see” objects across a wide range of lighting conditions, which demands intricate circuitry and complex algorithms. Unlike the human brain, current systems are rarely efficient enough to process large volumes of visual information in real time.
However, new bio-inspired sensors developed by Dr Chai Yang of the Hong Kong Polytechnic University and his team adapt directly to different light intensities instead of relying on back-end computation.
“These new sensors will greatly improve machine vision systems used for visual analysis and identification tasks,” stated Chai.
Natural light intensity spans a range of 280 dB. While conventional silicon-based sensors have an effective range of 70 dB, the new sensors developed by Chai’s team have an effective range of up to 199 dB, exceeding the 160 dB range of the human retina.
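To put those decibel figures in perspective, here is a minimal sketch that converts each quoted dynamic range into a max/min intensity ratio. It assumes the 20·log10 convention commonly used for image-sensor dynamic range; the article does not state which convention the researchers used, so the ratios are illustrative only.

```python
def db_to_ratio(db: float) -> float:
    """Convert a dynamic range in decibels to a max/min intensity ratio,
    assuming the 20*log10 convention common for image-sensor specs."""
    return 10 ** (db / 20)

# Dynamic ranges quoted in the article
ranges_db = {
    "natural light": 280,
    "conventional silicon sensor": 70,
    "human retina": 160,
    "new bio-inspired sensor": 199,
}

for label, db in ranges_db.items():
    print(f"{label}: {db} dB ~ {db_to_ratio(db):.3g}x intensity ratio")
```

Under this convention, 70 dB corresponds to a ratio of only a few thousand, while 199 dB corresponds to roughly nine orders of magnitude, which conveys how much wider the new sensors' operating range is.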
The sensors reduce hardware complexity and greatly increase image contrast under varied lighting, improving image-recognition efficiency.
Chai hopes that these sensors will be utilised in the next generation of artificial-vision systems used in autonomous vehicles, manufacturing, and edge computing.
The research was published in Nature Electronics.