One of the most challenging tracking scenarios in computer vision is quantifying the behavior of small animals traveling long distances through cluttered settings. Small, low-contrast foreground objects must be localized in chaotic, dynamic scenes, and in many long recordings camera motion and drift must be corrected before reliable trajectories can be recovered.
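To make the drift problem concrete, here is a minimal sketch of one common correction idea: when the camera translates between frames, static background landmarks appear to shift, so the median landmark displacement estimates the camera motion, which can then be subtracted from the animal's raw image-space track. This is an illustration of the general principle only; the function names and the translation-only model are ours, not CATER's actual reconstruction pipeline.

```python
# Illustrative drift correction via static landmarks (assumed approach,
# not CATER's actual method). Pure Python, translation-only camera model.
from statistics import median

def estimate_camera_shift(prev_pts, curr_pts):
    """Median displacement of matched static landmarks between two frames."""
    dx = median(c[0] - p[0] for p, c in zip(prev_pts, curr_pts))
    dy = median(c[1] - p[1] for p, c in zip(prev_pts, curr_pts))
    return dx, dy

def correct_trajectory(raw_track, landmark_frames):
    """Subtract accumulated camera shift from each raw track point.

    raw_track: [(x, y), ...], one image-space point per frame.
    landmark_frames: per-frame lists of matched landmark positions,
    same length and landmark order as raw_track.
    """
    corrected = [raw_track[0]]
    ox = oy = 0.0  # accumulated camera offset
    for i in range(1, len(raw_track)):
        dx, dy = estimate_camera_shift(landmark_frames[i - 1],
                                       landmark_frames[i])
        ox += dx
        oy += dy
        x, y = raw_track[i]
        corrected.append((x - ox, y - oy))
    return corrected
```

The median makes the shift estimate robust to a few mismatched or moving landmarks; a full pipeline would estimate a homography rather than a pure translation.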
An international research collaboration involving the University of Sheffield has developed CATER, a tracking technology that offers new insight into how desert ants navigate their complex worlds. It uses computer vision and AI to track individual desert ants over their entire foraging lives.
It combines an unsupervised probabilistic detection mechanism with a globally optimized environment-reconstruction pipeline, enabling precise behavioral quantification in natural environments. The tool documents an ant's journey from the moment it first leaves its nest until it finds a food site and returns to its colony.
The system can detect objects too small to see with the naked eye, and it is robust to background clutter, occlusions, and shadows, allowing it to function in the animal's natural habitat where other systems fail.
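The core idea behind unsupervised, probabilistic detection of small moving targets can be sketched with a per-pixel Gaussian background model: each pixel learns its own mean and variance over time, and values that are statistical outliers are flagged as foreground. This is a textbook background-subtraction sketch under our own assumptions, not CATER's actual detector, and every name below is ours.

```python
# Minimal per-pixel running-Gaussian background model (illustrative only).
class PixelBackgroundModel:
    """Learns per-pixel mean/variance; flags statistical outliers as foreground."""

    def __init__(self, alpha=0.05, k=3.0):
        self.alpha = alpha  # learning rate for the background update
        self.k = k          # detection threshold in standard deviations
        self.mean = None
        self.var = None

    def update(self, frame):
        """frame: 2D list of grayscale values. Returns a boolean foreground mask."""
        h, w = len(frame), len(frame[0])
        if self.mean is None:
            # Initialize the background from the first frame with a broad prior.
            self.mean = [row[:] for row in frame]
            self.var = [[25.0] * w for _ in range(h)]
            return [[False] * w for _ in range(h)]
        mask = [[False] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                d = frame[y][x] - self.mean[y][x]
                std = self.var[y][x] ** 0.5
                if abs(d) > self.k * std:
                    mask[y][x] = True  # outlier: likely a moving animal
                else:
                    # Fold background-like pixels into the running model.
                    self.mean[y][x] += self.alpha * d
                    self.var[y][x] += self.alpha * (d * d - self.var[y][x])
                    self.var[y][x] = max(self.var[y][x], 1.0)  # variance floor
        return mask
```

Because the threshold adapts to each pixel's own variance, a low-contrast ant can still stand out against a locally stable patch of ground, while noisy regions (flickering shadows, moving grass) raise their own variance and suppress false detections.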
According to the new dataset, the ants learn very quickly, memorizing their homeward paths after just one successful trip. Intriguingly, however, their outward trajectories changed over time, indicating different exploration-versus-exploitation strategies. The highly accurate data also revealed a previously hidden rhythmic component underlying their movement, which may help explain how ants generate intricate search patterns suited to the circumstances.
Research groups worldwide are already using the new software. Because it works with various animal species and with video shot on consumer cameras, it is well suited to citizen-science initiatives. The precise data it captures is essential for understanding how brains enable animals to navigate their intricate environments, and could inspire a new breed of bioinspired robots.
Dr. Michael Mangan, Senior Lecturer in Machine Learning and Robotics at the University of Sheffield, said: “We captured this data during a summer field trip, but it has taken 10 years to build a system capable of extracting the data, so you could say it’s been a decade in the making.”
“I’ve always been fascinated by how these insects can navigate long distances – up to 1km – in such forbidding landscapes where temperatures are over 50 degrees Celsius.”
“Up until now, desert ants have been tracked by hand using pen and paper, which involves creating a grid on the ground with string and stakes and monitoring their behavior within the grid. Another method used to get around this is differential GPS (DGPS) – but the equipment is expensive and low precision.”
“The lack of a low-cost, robust way to capture precise insect paths in the field has led to gaps in our knowledge about desert ant behavior: specifically, how they learn visual routes, how quickly they do so, and what strategies they employ that might simplify the task.”
Dr. Mangan said: “Desert ants are the ideal inspiration for next-generation robots – they navigate over long distances, through harsh environments, and don’t rely on pheromone trails like other ants, or GPS and 5G like current robots.
“We hope that our tool will allow us to build a complete picture of how insects learn to pilot through their habitats, bringing new scientific knowledge and informing engineers about how they could build similarly capable artificial systems.”
- Lars Haalck, Michael Mangan, et al. “CATER: Combined Animal Tracking & Environment Reconstruction.” Science Advances.