Digital phenotyping uses our smartphones to detect anything from Parkinson's disease to mental health disorders. But is our privacy at risk?
For starters, I forgot the alphabet.
It was one of those weeks where I had already missed every deadline, unable to focus on work or anything else. I was instructed to quickly touch alternating numbers and letters in order: 1A, 2B, 3C, 4D, 5E, 6… What is the next letter? The stopwatch ticked on, and I imagined my multitasking score dropping. (Oh well, multitasking isn’t a thing anyway, right?)
I’m a volunteer for a study that uses the BiAffect app to virtually screen my mental health. Created by a University of Illinois team, BiAffect is a phone app that monitors mood and how it impacts cognition. The study is built around the theory that personal devices, like fitness trackers, phones, or smartwatches, act as a digital proxy for human behavior.
This emerging scientific field is called digital phenotyping. It takes data stored in personal devices — typing patterns, speech, usage, or movement — to diagnose anything from Parkinson’s disease to mental health disorders like depression. The field is only a few years old, but it could be on the way to transforming healthcare.
Smartphones may be best suited to the task of digital phenotyping, given how widespread the technology is. Today, the majority of Americans own a smartphone, and millions of people already use them for things like tracking sleep and exercise.
Imagine walking into the doctor’s office: the nurse takes your temperature, blood pressure, and syncs your cell phone data — all metrics the doctor will use to assess your health. For Raeanne Moore, a psychiatrist at University of California San Diego, that future might not be too far from reality.
“Oftentimes, the smartphone is the first thing we look at when we wake up in the morning, the last thing we touch before we go to bed. We’re constantly interacting with them,” she says. “Everyone has their own unique digital signature. We’re finding more and more that these digital signatures are related to behavioral indicators of health and cognitive abilities.”
A University of Pennsylvania study showed that digital phenotyping can even help predict diabetes.
Still, the momentum in this field is primarily with mental health. Currently, there aren’t any effective biological tests for mental health disorders. You can’t get a blood test to find out your risk of schizophrenia, an MRI to diagnose bipolar disorder, or a genetic test for ADHD. But our behaviors on our phones are closely linked with how our minds work. Studies have shown that language used on social media could help predict suicide. Speech and breathing patterns detected by phones can be early indicators of opioid overdoses.
Traditionally, to diagnose most mental health disorders, doctors rely on patients self-reporting past symptoms or taking pen-and-paper cognitive tests. But Moore says those tests are unreliable and can be costly. That’s where digital phenotyping can fill in the gaps.
In an initial study, the BiAffect team found that changes in typing habits, such as typos or typing speed, could be used to predict manic or depressive episodes in users with bipolar disorder. Another study found that small changes in speech, as monitored by a smartphone app, could also give insight into mood. Instead of relying on self-reported symptoms, this real-time passive monitoring of behavior changes could go a long way toward diagnosing mental health disorders early — often before people even recognize symptoms.
“Most people aren’t diagnosed (with Alzheimer’s disease) until they’re very symptomatic. If we could identify people very early in that process, the available treatments are found to be most effective before someone is overtly symptomatic — to get them on medications early and improve their quality of life and hopefully stave off the decline for longer,” Moore says.
Moore stresses that integrating new technology into the healthcare system takes decades. Still, the current coronavirus pandemic is fast-tracking research in this field.
“At this time, funding is becoming more and more available to scale up this work and help people like myself and other researchers in this space,” she says.
On March 17, the Trump administration relaxed restrictions on telehealth for seniors to reduce the risk of people catching or spreading the novel coronavirus in doctors’ offices. And new apps that track coronavirus or assess the users’ risk of catching it are quickly being developed.
One challenge that proponents of digital phenotyping will need to address is a broad concern that the practice will normalize attitudes about surveillance. Our phones already collect a considerable amount of information: Apple tracks the number of times you hit the home button and how much time is spent on each app. GPS monitors movement. And there is social behavior data, plus the data people leave behind via cookies, which act like breadcrumb clues across the internet. All of this creates a unique digital signature for each person, which raises the prospect of using digital phenotype research to surveil people’s private digital behavior.
Public research organizations operate with a high level of trust because they work within strict frameworks regarding how they can use information, says Kit Huckvale, Research Fellow in Health Informatics at Black Dog Institute. In a clinical setting, there would be little interest in leaking participants’ data. But in a commercial setting, he believes the rules may be a bit blurrier. Beyond that, consent agreements and the risks of data sharing are challenging to understand — possibly more so for people with mental health disorders.
“There’s a whole opportunity for research in that area, as well. How do you help people make informed consent choices about this new world of digital data? I don’t think we have necessarily great answers to that yet. But we know it’s a problem,” he says, while acknowledging that the most likely consequence of your data being used in commercial ways is the annoyance of targeted advertising — something many people have already accepted.
Victoria Smith, acting CEO of CompanionMx, a commercially available app, says that their app is HIPAA compliant — meaning they follow the same rules and regulations as hospitals when collecting and storing data. CompanionMx is not regulated by the FDA as a medical device because it doesn’t do any type of diagnosis or treatment, only monitoring. And like most commercial or research apps, it asks for consent to collect and use the data.
“We measure depressed mood most of the day — avoidance of people, lack of interest or pleasure in activities, and fatigue. Those things are all measured through a combination of voice and passive smartphone sensing,” says Smith.
Other commercial apps like Mindstrong or Spire Health also monitor users in real time. They log physical signs like pulse, voice tone, or activity level to monitor changes in mental health, or even intervene by connecting the user with a human counselor.
“Privacy” is a precarious balance for someone whose work is on the web and whose heart is shared on Facebook. I joined the BiAffect study out of curiosity: about my data, about digital phenotyping, and about my own privacy threshold. But in this wild new world of a global pandemic, job loss, lockdowns, and a one-room apartment for my family, I can’t help but feel the weightiness of our collective mental health. We’ll need creative solutions to address gaps in mental health diagnosis and support. And I’m OK with a few more targeted ads because of it.