New sonar-equipped glasses use AI to interpret upper body poses in 3D

A game-changer in wearable body-sensing technology.


For decades, sonar’s distinctive “ping” has been used to map oceans, spot enemy submarines, and find sunken ships. Now, Cornell scientists have brought the technology to a wearable.

Scientists have created PoseSonic, an intelligent acoustic sensing solution for smart glasses that estimates upper body poses. The wearable uses artificial intelligence (AI) and inaudible sound waves to track the movements of the wearer’s upper body in three dimensions. It consists of generic eyeglasses with built-in micro sonar.

Saif Mahmud, a doctoral student in information science, said, “What’s exciting to me about PoseSonic is the potential for its use in detecting fine-grained human activities in the wild. Having lots of data through body-sensing technology like PoseSonic can help us be more mindful of ourselves and our behaviors.”

The Cornell team is the first to use inaudible acoustics and AI to track body poses through a wearable device. They integrated AI into a low-power, low-cost, privacy-conscious acoustic sensing system. The approach also requires less instrumentation on the body, making it more practical, and its battery performance is significantly better for everyday use.

PoseSonic mounts two pairs of tiny microphones and speakers, each around the size of a pencil, on the hinges of a pair of eyeglasses. Inaudible sound waves from the speakers bounce off the upper body and return to the microphones, producing an echo profile image. PoseSonic’s machine-learning model uses this image to estimate body posture with near precision. Additionally, unlike previous data-driven wearable pose-tracking systems, PoseSonic performs well without requiring the user to undergo an initial training session.
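To make the sensing step concrete, here is a minimal sketch of how an echo profile image can be built from raw audio: cross-correlate each received frame with the transmitted inaudible sweep, and stack the correlation magnitudes over time so that rows encode echo delay (range) and columns encode time. The chirp parameters, frame length, and simulated echo below are illustrative assumptions, not PoseSonic’s actual pipeline.

```python
import numpy as np

def chirp(fs, duration, f0, f1):
    """Linear inaudible chirp (e.g., an 18-21 kHz sweep) as the transmit signal."""
    t = np.arange(int(fs * duration)) / fs
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t**2))

def echo_profile(received, tx, frame_len):
    """Cross-correlate each received frame with the transmit chirp and stack
    the correlation magnitudes into a 2D echo-profile image:
    rows = delay (range) bins, columns = time frames."""
    n_frames = len(received) // frame_len
    profile = []
    for i in range(n_frames):
        frame = received[i * frame_len:(i + 1) * frame_len]
        corr = np.correlate(frame, tx, mode="full")
        profile.append(np.abs(corr))
    return np.array(profile).T  # shape: (delay bins, frames)

# Simulated example: a delayed, attenuated copy of the chirp as the "echo".
fs = 48000
tx = chirp(fs, 0.01, 18000, 21000)   # 10 ms inaudible sweep
frame_len = 960                      # 20 ms frames at 48 kHz
rx = np.zeros(frame_len * 50)
for i in range(50):
    delay = 100 + i                  # echo delay drifts as the body moves
    rx[i * frame_len + delay : i * frame_len + delay + len(tx)] += 0.3 * tx
image = echo_profile(rx, tx, frame_len)
print(image.shape)                   # one column per 20 ms frame
```

In a real system, this 2D image (one per microphone channel) would be the input to the downstream pose-estimation network; here the drifting correlation peak simply stands in for a moving body part.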

The system estimates motion at nine body joints: the nose (used to determine head posture) and both wrists, shoulders, elbows, and hips.
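A pose estimate over those nine joints is naturally a 9 × 3 array of coordinates. The sketch below shows one plausible way to decode a flat 27-value network output into named 3D joints; the joint ordering and the `decode_pose` helper are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical joint ordering; the paper's actual ordering may differ.
JOINTS = ["nose", "l_wrist", "r_wrist", "l_shoulder", "r_shoulder",
          "l_elbow", "r_elbow", "l_hip", "r_hip"]

def decode_pose(model_output):
    """Reshape a flat 27-value model output (9 joints x 3 coordinates)
    into a {joint_name: (x, y, z)} dictionary."""
    coords = np.asarray(model_output, dtype=float).reshape(len(JOINTS), 3)
    return {name: tuple(xyz) for name, xyz in zip(JOINTS, coords)}

pose = decode_pose(np.arange(27))
print(pose["nose"])  # (0.0, 1.0, 2.0)
```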

The technology is a significant advance over current wearables, which frequently require a tiny video camera, something that isn’t always practical. According to the scientists, camera-based wearables also consume a lot of power and raise privacy concerns. Acoustic sensing uses ten times less power than a wearable camera, allowing for a device that is smaller and less noticeable.

Journal Reference:

  1. Saif Mahmud, Ke Li, Guilin Hu, Hao Chen, Richard Jin, Ruidong Zhang, François Guimbretière, and Cheng Zhang. 2023. PoseSonic: 3D Upper Body Pose Estimation Through Egocentric Acoustic Sensing on Smartglasses. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 7, 3, Article 111 (September 2023), 28 pages. DOI: 10.1145/3610895
