How Machines Can Help Identify Suicidal Behaviour


Machine learning gives computers the ability to learn without being explicitly programmed, and scientists across many fields are taking advantage of it.

Scientists from the Cincinnati Children’s Hospital Medical Center now report that new computer tools can identify a person’s suicidal behavior.

The new study shows that machine learning can identify whether a person is suicidal with 93 percent accuracy. With 85 percent accuracy, it can distinguish among three groups: people who are suicidal, people who have a mental illness but are not suicidal, and people who are neither.

Professor John Pestian said, “Such advanced technology provides us with strong evidence about machine learning. By using it as a decision-support tool, clinicians and caregivers can identify and prevent patients’ suicidal behaviour.”


These computational approaches provide novel opportunities to apply technological innovations in suicide care and prevention, and they are surely needed. Health care facilities in general enjoy tremendous support from technology, but those who care for people with mental illness do not. Only now are algorithms capable of supporting those caregivers. The methodology can easily be extended to schools, shelters, youth clubs, juvenile justice centres, and community centres, where earlier identification may help reduce suicide attempts and deaths.

To test the tool, scientists enrolled 379 patients from emergency departments and inpatient and outpatient centers at three sites. The patients were either suicidal, diagnosed as mentally ill but not suicidal, or neither; this last group served as a control.

During the study, scientists asked patients questions such as: “Do you have hope?” “Are you angry?” and “Does it hurt emotionally?”

Scientists then analyzed the patients’ verbal and non-verbal language in the recorded answers and used machine learning to classify the differences between the groups with high accuracy.

Additionally, they noticed that the control patients laughed more during interviews, sighed less, and expressed less anger, less emotional pain, and more hope.
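The article does not publish the study's actual model or feature set, but the workflow it describes, turning interview cues such as laughter, sighs, hope, and anger into numeric features and classifying them, can be sketched with a toy nearest-centroid classifier. All feature names and training numbers below are hypothetical, chosen only to mirror the cues the article mentions.

```python
# Illustrative sketch only: a tiny nearest-centroid classifier over
# hypothetical interview features [laughs, sighs, hope_words, anger_words].
# The study's real model and features are not described in the article.

def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance_sq(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_centroid(sample, centroids):
    """Return the label whose class centroid is closest to the sample."""
    return min(centroids, key=lambda label: distance_sq(sample, centroids[label]))

# Hypothetical training interviews per group.
training = {
    "control":  [[5, 1, 4, 0], [6, 0, 5, 1]],   # laughed more, more hope
    "suicidal": [[0, 4, 0, 3], [1, 5, 1, 4]],   # sighed more, more anger
}
centroids = {label: centroid(rows) for label, rows in training.items()}

# A new interview with frequent laughter and hopeful language.
print(nearest_centroid([4, 1, 3, 1], centroids))  # → control
```

A real system would use far richer linguistic and acoustic features and a stronger classifier, but the principle is the same: the groups separate because their feature distributions differ, exactly as the article notes for laughter, sighs, anger, emotional pain, and hope.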
