Wearable systems now serve numerous purposes, and researchers keep adding major advancements to them. In that vein, MIT scientists have developed a new artificially intelligent wearable system that can detect the tone of a conversation. The system can predict whether a conversation is happy, sad, or neutral based on a person’s speech patterns and vital signs.
The tone-detecting wearable uses deep-learning techniques to assign a sentiment score to each five-second interval within a conversation.
The scientists developed the system with privacy strongly in mind: its algorithm runs locally on the user’s device, protecting personal information such as passwords.
Tuka Alhanai, a graduate student at MIT, said, “We are on the way to a world where people will have an AI social coach right in their pocket.”
“This system can analyze audio, text transcriptions and physiological signals to determine the overall tone of the story with 83 percent accuracy.”
During the first experiment with the device, the scientists collected both physiological and speech data in a passive but robust way. From the results, they concluded that it is possible to classify the emotional tone of a conversation in real time.
To elicit more organic emotions, the scientists asked subjects to tell a happy or sad story of their own choosing while wearing the research device. The device captures high-resolution physiological waveforms to measure features such as movement, heart rate, blood pressure, blood flow, and skin temperature.
In all, the scientists captured 31 conversations of several minutes each and ran two algorithms on the data. The first algorithm classifies the overall nature of a conversation, while the second labels each five-second block of the conversation as positive, negative, or neutral.
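The two-level labeling described above can be sketched in a few lines of code. This is a hypothetical illustration only: the actual MIT system feeds audio, text transcriptions, and physiological signals into trained deep-learning models, whereas the stand-in classifier below simply thresholds the mean of a toy per-second feature stream, then derives an overall tone by majority vote across the five-second blocks.

```python
# Hypothetical sketch of five-second-block labeling plus an overall tone.
# The threshold-based classifier is illustrative, not the MIT model.
from collections import Counter

WINDOW_SECONDS = 5

def split_into_windows(samples, sample_rate_hz):
    """Group a feature stream into five-second blocks."""
    step = WINDOW_SECONDS * sample_rate_hz
    return [samples[i:i + step] for i in range(0, len(samples), step)]

def classify_block(block):
    """Stand-in block classifier: sign of the mean feature value."""
    mean = sum(block) / len(block)
    if mean > 0.1:
        return "positive"
    if mean < -0.1:
        return "negative"
    return "neutral"

def classify_conversation(samples, sample_rate_hz=1):
    """Label each block, then take the majority label as the overall tone."""
    blocks = split_into_windows(samples, sample_rate_hz)
    labels = [classify_block(b) for b in blocks]
    overall = Counter(labels).most_common(1)[0][0]
    return overall, labels

# Toy stream: 15 seconds of per-second sentiment-like scores.
overall, labels = classify_conversation(
    [0.4, 0.5, 0.3, 0.6, 0.2,       # upbeat stretch
     -0.3, -0.4, -0.2, -0.5, -0.1,  # downbeat stretch
     0.0, 0.05, -0.05, 0.02, 0.0]   # flat stretch
)
print(overall, labels)  # → positive ['positive', 'negative', 'neutral']
```

A real system would replace `classify_block` with a neural network over multimodal features, but the windowing and aggregation structure would look much the same.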
The scientists said, “The system’s performance would be further improved by having multiple people in a conversation use it on their smartwatches, creating more data to be analyzed by their algorithms.”
Alhanai said, “The system picks up on how, for example, the sentiment in the text transcription was more abstract than the raw accelerometer data. It’s quite remarkable that a machine could approximate how we humans perceive these interactions.”