'Empathetic Technology': Can Devices Know What You're Feeling?

Islamabad (Pakistan Point News / Online - 14th April, 2019) For some, the word "technology" might evoke cold imagery of steely robots and complex computer algorithms. But a talk on "empathetic technology" at this year's Wired Health conference did a lot to change this perception.

With approximately 39 million people in the United States currently owning a smart speaker, technology that caters to our needs is more and more ubiquitous, taking up ever more of our personal space.

But smart devices can do so much more than merely play our favorite song or search the internet when we ask them to. Smart speakers may soon be able to diagnose us or tell how we are feeling.

At Wired Health, an annual conference that brings to the fore the latest developments in health tech, neuroscientist and technologist Poppy Crum, Ph.D., gave a talk aptly titled "Technology that knows what you're feeling."

Treading a fine line between ominous and hopeful, the title made a powerful point: soon, consumer technology may know our mental and physical states before we do.

But how, exactly, can technology achieve this? How can we harness its potential to help us elucidate mental and physical conditions, and what role does empathy play in all of this? These are some of the questions that Crum answered at Wired Health, an event which this year took place at the Francis Crick Institute in London, United Kingdom.

Practical applications of empathetic tech

"Empathetic" hearing aids could be personalized and attuned to the amount of effort that a person with hearing problems needs to use in order to make out what someone is saying, said Crum in her Wired Health talk.

This would help destigmatize those living with certain disabilities, as well as provide these people with optimal care.

Empathetic technology also has wide implications for our mental wellbeing. "With more capable cameras, microphones, thermal imaging, and exhalant measuring devices, we can capture prolific data," writes Crum, and that data can, in turn, alert carers.

On the subject of mental health, it is not only the eyes that offer a window into someone's "soul" but also the voice, Crum expounded in her Wired Health talk.

Researchers have applied artificial intelligence (AI) to data they gathered on parameters such as syntactic patterns, pitch-reflex, and use of pronouns to accurately detect the onset of depression, schizophrenia, or Alzheimer's disease.

For example, less than a year ago, Tuka Alhanai, a researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology in Cambridge, MA, led a team that designed a neural network model that accurately predicted depression by analyzing speech patterns in 142 participants.

"The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed [...] Then, if it sees the same sequences in new subjects, it can predict if they're depressed too."
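To make the idea concrete, here is a minimal sketch, not the CSAIL model itself, of how one linguistic cue the researchers mention, use of pronouns, could be turned into a numeric feature for a classifier. The function name `first_person_rate` and the pronoun list are illustrative assumptions:

```python
# Toy feature extractor: rate of first-person pronoun use in a transcript.
# Real systems combine many such features (syntax, pitch, word sequences)
# and feed them into a neural network; this only shows the feature step.
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(transcript: str) -> float:
    """Fraction of words in the transcript that are first-person pronouns."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    return sum(w in FIRST_PERSON for w in words) / len(words)

# 2 of the 7 words here are first-person pronouns.
print(first_person_rate("I feel like I am always tired"))
```

A model would then learn, from labeled data, whether elevated values of features like this correlate with depression, rather than relying on any single hand-set threshold.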