The Language of Loneliness

Stevens researcher K.P. Subbalakshmi teamed with IBM and two universities to spot signs of alienation.

Alienation is a major issue in the U.S., and the situation has only worsened since the global COVID-19 pandemic. Several recent surveys suggest that one in every three U.S. adults and young adults regularly feels lonely.

Now Stevens artificial intelligence expert K.P. “Suba” Subbalakshmi — working with IBM, the University of California San Diego and China’s Jiangnan University — has developed an AI-based system that predicts loneliness in older adults from personal interviews. The technology may help us better understand the roots of people’s alienation, too.

The team surveyed about 100 residents, ranging in age from 66 to 101, of a continuing-care senior housing community. Their states of mind were assessed in two ways: through personal interviews with a trained psychiatrist and through the UCLA Loneliness Scale, a self-reporting questionnaire. Next, the team transcribed audio of the interviews and fed the transcripts, along with the questionnaire data, into a specially designed AI model that quickly “learned” the linguistic differences between responses from lonely and non-lonely people and then offered predictions. Presented with interview transcripts and no other context, the AI proved 89% accurate at predicting whether people would report feeling lonely.
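The study used a purpose-built model; purely as a rough illustration of what a transcript-classification pipeline of this general kind looks like, the sketch below pairs short, invented transcripts with loneliness labels and fits a simple bag-of-words classifier. The data, features and classifier choice are assumptions for illustration, not the researchers' actual implementation.

```python
# Illustrative sketch only: classify interview transcripts by self-reported
# loneliness. The Stevens/IBM study used a specially designed AI model;
# the data and classifier here are placeholder assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

# Hypothetical transcripts paired with questionnaire labels
# (1 = reported feeling lonely, 0 = did not).
transcripts = [
    "I mostly keep to myself these days and feel forgotten.",
    "My friends visit every week and we talk for hours.",
    "Nobody really calls anymore; the house feels very empty.",
    "I volunteer at the library and love seeing everyone there.",
]
labels = [1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    transcripts, labels, test_size=0.5, stratify=labels, random_state=0
)

# Convert each transcript to word-frequency features, then fit a classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```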

That’s not all. Certain components of the new AI helped the researchers better understand the nuances of participants’ answers. These components of the algorithmic system, known as explainable AI (XAI), reported back which sections of each interview weighed most heavily in the model’s final prediction of whether a person felt lonely.

XAI revealed, for instance, that an increased use of emotional adjectives and verbs in verbal responses to the interviewers tended to indicate a person felt lonely — as did increased references to religion, and highly analytical responses. An increased use of personal pronouns also seemed to point to loneliness.
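To make those cues concrete, here is a minimal sketch that counts two of the signals mentioned above, personal pronouns and emotion words, in a transcript. The word lists and example sentence are invented for illustration; the study's XAI derived its cues from the trained model itself rather than from fixed dictionaries like these.

```python
# Rough illustration of the linguistic cues described above: simple counts
# of personal pronouns and emotion words in one transcript. Word lists are
# assumptions, not the study's actual feature definitions.
import re

PERSONAL_PRONOUNS = {"i", "me", "my", "mine", "myself"}
EMOTION_WORDS = {"sad", "lonely", "happy", "afraid", "miss", "love", "worry"}

def cue_counts(transcript: str) -> dict:
    """Return per-category word counts for one interview transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return {
        "pronouns": sum(w in PERSONAL_PRONOUNS for w in words),
        "emotion_words": sum(w in EMOTION_WORDS for w in words),
        "total_words": len(words),
    }

print(cue_counts("I miss my sister; I feel sad most evenings."))
```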

“XAI can play a crucial role not only in identifying individuals at risk of loneliness, but also in understanding the loneliness itself,” says Subbalakshmi.

That’s important, she says, since it’s now believed there are various subtypes of loneliness that require personalized interventions.

“Loneliness caused by actual social isolation might be remedied by building more and stronger social connections,” Subbalakshmi suggests. “But loneliness experienced by a person surrounded by friends and family could stem from a different cause, requiring a different intervention.”

This proof-of-concept study, she continues, indicates XAI can help professionals separate out and elevate the important clues and cues in conversations with older adults in order to better help them recover emotional wellness.

The study, supported by both IBM Research and the National Institutes of Health, was reported in Psychiatry Research.

– Paul Karr

Research Briefs

Giving AI a ‘Sense of Touch’

AI is good at many things, but feeling and measuring tiny surfaces and distances like humans can isn’t one of them.

But a Stevens team has leveraged quantum technologies to teach AI to “feel.” Professors Yong Meng Sua and Yuping Huang and doctoral candidates Daniel Tafone and Luke McEvoy ’22 M.S. ’23 devised a quantum-lab setup that combines a photon-firing scanning laser with AI models.

The team tested 31 industrial sandpapers with surfaces of varying roughness; the system’s accuracy proved comparable to the best industrial profilometer devices currently in use. The new method could be useful for a variety of applications, the researchers say, from quality control of manufactured components to differentiating between harmless skin moles and potentially fatal melanomas.

Grappling with Gridlock

Gridlock is a huge issue in urban areas, particularly during emergencies, when evacuations occur.

Now two Stevens researchers have developed a system for predicting these jams in real time, aiding planners and emergency managers.

Professor Mohammad Ilbeigi and recent graduate Mina Nouri Ph.D. ’24, working with the University of Florida, used New York City-area traffic data from the period surrounding 2012’s Hurricane Sandy, along with a technique that bundles the region’s 10,000-plus roadway segments into larger chunks, or zones.

A network-based approach, which models interdependencies among road segments by considering whether adjacent zones are flowing smoothly or degrading, gives a better grasp of the problem.

The duo developed computational models and trained them on traffic data from 700 million New York City taxi rides, focusing particularly on rides around the Sandy disaster. The analysis uncovered the locations of two critical bottlenecks and revealed the worst slowdowns occurred in two spikes, one a day before the storm struck and another during the second and third days afterward.
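As a toy illustration of the network idea described above, the sketch below treats zones of bundled road segments as nodes in a graph and flags a zone as a potential bottleneck when it and at least one neighboring zone are both slowing down. The zone names, speeds, adjacency and threshold are invented examples, not the researchers' data or model.

```python
# Sketch of a network-based view of congestion: each zone's risk depends on
# how its neighboring zones are flowing. All values are hypothetical.

# Hypothetical zones with average observed speed (mph) and their neighbors.
zone_speed = {"A": 9.0, "B": 22.0, "C": 7.5, "D": 25.0}
neighbors = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}

CONGESTED_BELOW_MPH = 10.0  # assumed cutoff for a "degraded" zone

def degraded(zone: str) -> bool:
    return zone_speed[zone] < CONGESTED_BELOW_MPH

for zone in zone_speed:
    # Flag a zone when it is degraded and an adjacent zone is degrading too,
    # since slowdowns in neighboring zones can cascade into gridlock.
    if degraded(zone) and any(degraded(n) for n in neighbors[zone]):
        print(f"zone {zone}: potential bottleneck (neighbors also slowing)")
```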

– Paul Karr