
Gaining Insight on Robotic Vision From Insects

University of Adelaide Ph.D. student Zahra Bagheri and supervisor Professor Benjamin Cazzolato with a vision system that uses algorithms based on insect vision. (Photo credit: University of Adelaide.)

A new research project seeks to improve robot vision by applying findings from research on insect and human sight. As part of the work, the researchers built a virtual reality simulation in which an artificial intelligence system pursued a moving object. The findings were recently published in the Journal of The Royal Society Interface.

“Detecting and tracking small objects against complex backgrounds is a highly challenging task,” says the lead author of the paper, Mechanical Engineering PhD student Zahra Bagheri of the University of Adelaide.

“Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd – all while running or even diving towards the point where they predict it will fall!

“Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill,” she says.

Research conducted in the lab of University of Adelaide neuroscientist Dr Steven Wiederman has shown that flying insects, such as dragonflies, show remarkable visually guided behaviour. This includes chasing mates or prey, even in the presence of distractions, like swarms of insects.

“They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97%,” Ms Bagheri says.

The team of engineers and neuroscientists has developed an unusual algorithm to help emulate this visual tracking. “Instead of just trying to keep the target perfectly centred on its field of view, our system locks on to the background and lets the target move against it,” Ms Bagheri says. “This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep the target roughly frontal.”
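The control idea Ms Bagheri describes can be sketched, very loosely, as a gaze-update loop: counter-rotate to stay locked on the background, then make a small, capped saccade toward the target. The sketch below is an illustrative toy, not the published algorithm; the `saccade_step` helper, the gains, and the angle units are all assumptions made here for demonstration.

```python
def saccade_step(gaze, target, background_shift, max_turn=0.2):
    """One toy 'active vision' update (illustrative, not the published method).

    gaze, target: angles (radians) in the world frame.
    background_shift: apparent rotation of the background since the last
    step (e.g. from ego-motion); counter-rotating by it keeps the gaze
    locked to the background so the target's own motion stands out.
    """
    # Lock the gaze to the background by undoing the background's apparent shift
    gaze -= background_shift
    # Small saccade toward the target, capped so the gaze turns gradually
    # and the target stays roughly frontal rather than perfectly centred
    error = target - gaze
    turn = max(-max_turn, min(max_turn, 0.8 * error))
    return gaze + turn

# Toy pursuit: the target drifts while the observer's own motion
# shifts the apparent background on every step
gaze, target = 0.0, 1.0
for _ in range(30):
    target += 0.02  # target keeps moving
    gaze = saccade_step(gaze, target, background_shift=0.05)
print(abs(target - gaze) < 0.1)  # gaze has settled roughly on the target
```

The cap on the turn size is the key design choice here: instead of chasing perfect centring, the gaze drifts toward the target in small steps, which is one way to read the "locks on to the background and lets the target move against it" strategy.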

This bio-inspired “active vision” system has been tested in virtual reality worlds composed of various natural scenes. The Adelaide team found that it performs just as robustly as state-of-the-art engineering target-tracking algorithms, while running up to 20 times faster.

“This type of performance can allow for real-time applications using quite simple processors,” says Dr Wiederman, who is leading the project, and who developed the original motion sensing mechanism after recording the responses of neurons in the dragonfly brain.

“We are currently transferring the algorithm to a hardware platform, a bio-inspired, autonomous robot.”
