An open-source AI tool for studying movement across behaviors and species

Movement monitor.


Understanding how neural circuits drive behavior requires precise, rigorous tracking of that behavior, yet the increasingly complex tasks animals perform in the laboratory have made such tracking challenging.

Now, a team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of Tübingen is turning to artificial intelligence technology to solve the problem.

They have developed new software, dubbed DeepLabCut, that harnesses machine-learning techniques to track features ranging from the digits of mice to egg-laying behavior in Drosophila and beyond.


The software is the brainchild of Mackenzie Mathis, a Rowland Fellow at the Rowland Institute at Harvard; Alexander Mathis, a postdoctoral fellow working in the lab of Venkatesh N. Murthy, professor of molecular and cellular biology and chair of the Department of Molecular and Cellular Biology; and Matthias Bethge, a professor at the University of Tübingen and chair of the Bernstein Center for Computational Neuroscience Tübingen.

The software was born partly of necessity. Both Mackenzie and Alexander Mathis had tried traditional tracking techniques, which typically involve placing markers on animals and using heuristics such as object segmentation, with mixed success.

The scientists noted, “Such techniques are often sensitive to the choice of analysis parameters, and markers or tattoos are invasive and can hinder natural behaviors, or may be impossible to place on very small or wild animals.”

Deep-learning algorithms offered an alternative, but they are widely regarded as data-hungry, requiring thousands of labeled examples for an artificial neural network to learn. That is prohibitively large for typical laboratory experiments and would demand days of manual labeling for every behavior.

The solution came in what is called “transfer learning,” or applying an already-trained network to a different problem, much the way scientists believe biological systems learn.

Using a state-of-the-art algorithm for tracking human movement called DeeperCut, the Mathises were able to show that deep learning could be highly data-efficient.

The pretrained network draws on the equivalent of thousands of hours of visual experience and adapts it to recognize new objects. With that pretraining in place, the software needed only about 100 examples of mice performing an odor-guided navigation experiment to recognize specific mouse body parts as well as humans could.
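The data-efficiency idea can be illustrated with a deliberately tiny sketch (not DeepLabCut's actual code): a “pretrained” feature extractor is kept frozen, and only a small readout is fitted on a handful of labeled examples. All names here are illustrative stand-ins.

```python
# Toy illustration of transfer learning: the feature extractor below stands in
# for a network pretrained on a large dataset (e.g. ImageNet); its parameters
# stay frozen, and only the small linear readout is trained on few examples.

def pretrained_features(x):
    """Frozen 'pretrained' representation of the input (a stand-in)."""
    return [x, x * x, 1.0]

def fit_readout(examples, lr=0.01, epochs=2000):
    """Fit only the readout weights on top of the frozen features (SGD)."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in examples:
            feats = pretrained_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, feats))
            err = pred - y
            w = [wi - lr * err * fi for wi, fi in zip(w, feats)]
    return w

def predict(w, x):
    return sum(wi * fi for wi, fi in zip(w, pretrained_features(x)))

# A relationship the frozen features can already express: y = 2x^2 + 3.
# Five labeled examples suffice because most of the "knowledge" is reused.
examples = [(x, 2 * x * x + 3) for x in (-1.0, 0.0, 0.5, 1.0, 2.0)]
w = fit_readout(examples)
print(predict(w, 1.5))  # close to 2 * 1.5**2 + 3 = 7.5
```

Because the heavy lifting lives in the reused representation, only a few parameters need data, which is the same reason roughly 100 labeled frames sufficed in the mouse experiment.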

Mackenzie Mathis said, “We were very impressed by the success of the transfer-learning approach and the versatility of DeepLabCut. With only a few hundred frames of training data, we got accurate and robust tracking across a myriad of experimental conditions, animals, and behaviors.”

Alexander Mathis said, “Experimentalists have very good intuitions about what body parts should be analyzed to study a particular behavior, but traditionally extracting limb coordinates from videos has been very challenging — DeepLabCut does just that based on a few examples. Since the program is designed as a user-friendly, ‘plug-and-play’ solution and does not require any coding skills, it can be widely used.”

Bethge said, “We want as many researchers as possible to benefit from our work. DeepLabCut was created as an open software, as sharing results, data, and also algorithms is essential for scientific progress.”

The work is described in an Aug. 20 paper published in Nature Neuroscience.
