
California Tests Mental Health App That Tracks Everything Patients Do on Their Phones


steven36


The way we interact with our devices is increasingly an extension of our innermost selves. It’s no wonder startups view the data gleaned from these behaviors as a window into our mental health, and as the foundation for their business model.

 

https://s7d3.turboimg.net/sp/99cfe443ddaffb85645af704a153a6b8/5bfb.jpg

 

California mental health officials have been in talks with two startups about developing a digital system that identifies when a smartphone user is about to have an emotional crisis, characterizing it as a “fire alarm,” according to a report on Monday from the New York Times. Officials from 13 counties and two cities are reportedly involved in developing the system, which is already being tested on individuals receiving help from the Los Angeles County public mental health network.

 

Mindstrong, a mood-predicting app, and 7 Cups, an online therapy service, have reportedly been working with state officials on the system since last summer. Participants in the trial let Mindstrong install a keyboard on their phones that constantly tracks their screen activity. The company’s algorithm can establish a user’s normal activity from a week of data, Dr. Thomas R. Insel, one of Mindstrong’s founders, told the New York Times. When behavior repeatedly diverges from that baseline, the app sends a message to the user. It reportedly takes the company about a day to identify such a disruption.
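To make the mechanism concrete, here is a minimal sketch of what baseline-and-divergence detection can look like in practice. It assumes simple daily usage metrics, a one-week baseline, and a z-score threshold; the metric names, threshold, and alerting rule are all illustrative assumptions, not Mindstrong’s actual method.

# Hypothetical sketch: build a week-long baseline of daily usage metrics,
# then flag days that diverge from it. Nothing here reflects Mindstrong's
# real features or thresholds.
from statistics import mean, stdev

def build_baseline(daily_metrics):
    """daily_metrics: list of per-day dicts, e.g. seven days of
    {"taps_per_min": ..., "scroll_events": ..., "typing_speed": ...}."""
    baseline = {}
    for key in daily_metrics[0]:
        values = [day[key] for day in daily_metrics]
        baseline[key] = (mean(values), stdev(values))
    return baseline

def divergent_metrics(day, baseline, z_threshold=2.0):
    """Return the metrics on which today's behavior diverges from the baseline."""
    flagged = []
    for key, (mu, sigma) in baseline.items():
        if sigma == 0:
            continue
        z = abs(day[key] - mu) / sigma
        if z > z_threshold:
            flagged.append(key)
    return flagged

# A user might only be nudged after several metrics diverge at once,
# or after divergence persists across checks.
week = [
    {"taps_per_min": 38, "scroll_events": 420, "typing_speed": 31},
    {"taps_per_min": 41, "scroll_events": 390, "typing_speed": 33},
    {"taps_per_min": 36, "scroll_events": 450, "typing_speed": 30},
    {"taps_per_min": 40, "scroll_events": 410, "typing_speed": 32},
    {"taps_per_min": 39, "scroll_events": 430, "typing_speed": 29},
    {"taps_per_min": 37, "scroll_events": 400, "typing_speed": 31},
    {"taps_per_min": 42, "scroll_events": 440, "typing_speed": 34},
]
baseline = build_baseline(week)
today = {"taps_per_min": 12, "scroll_events": 900, "typing_speed": 15}
if len(divergent_metrics(today, baseline)) >= 2:
    print("multiple divergences detected; consider sending a check-in message")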

 

According to Mindstrong’s website, the company uses “powerful machine learning methods to show that specific digital features correlate with cognitive function, clinical symptoms, and measures of brain activity in a range of clinical studies.” Those digital features include the ways someone interacts with their screen, such as tapping and scrolling. Company patents list a number of other behaviors and activities Mindstrong might find useful to monitor, including the opening and closing of apps, character and voice inputs, touchscreen gestures, GPS coordinates, accelerometer and gyroscope readings, incoming and outgoing calls, emails, and messages, books read on an e-reader app, and games played on apps.
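For a sense of how broad that collection is, the patent-listed signals could be grouped into a single per-sample record like the sketch below. The field names and groupings are hypothetical, not a published Mindstrong schema.

# Illustrative only: one record of the kinds of "digital features" described
# in the patents. Field names are assumed groupings, not a real schema.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DigitalFeatureSample:
    timestamp: float                                                  # when the sample was collected
    touch_gestures: List[str] = field(default_factory=list)           # taps, scrolls, swipes
    keystroke_intervals: List[float] = field(default_factory=list)    # ms between key presses
    apps_opened: List[str] = field(default_factory=list)
    gps_fixes: List[Tuple[float, float]] = field(default_factory=list)
    accelerometer: List[Tuple[float, float, float]] = field(default_factory=list)
    calls_made: int = 0
    messages_sent: int = 0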

 

A “few dozen people” reportedly had Mindstrong’s alternate keyboards installed on their phones last winter, but about half of those people are no longer using the keyboard functionality, citing loss of interest or technical difficulties.

 

“It’s been a little rough in the beginning, I have to say, and it may take a couple of years,” Dr. Insel told the New York Times. “The program may have to fail at first.”

 

While there has been a growing push among tech companies and medical professionals to figure out how technology, and specifically artificial intelligence, can help identify and intervene with people struggling with mental health issues, no system has yet proven successful over the long term. As with California’s recent efforts, these systems are being deployed in their trial phases on the people arguably at the highest risk should something go awry.

 

And aside from the potential for the program to fail at first, as Insel himself pointed out, even if it works as intended there are still unsettling privacy concerns to consider. A patient is effectively allowing a tech company to surveil their every waking moment on their smartphone. The intent, of course, is admirable: the company wants to provide the necessary resources during especially vulnerable moments. But it’s still unclear how effective an algorithm can be for someone at their most distressed, and in the meantime, the distressed are handing a tech company its most desired asset: a wealth of deeply intimate data.

 

Source


The next step is to install this app (or similar) without users knowing about it, and send a message to 'authorities' when the behavior diverges from the one established by the 'authority'.

Or is it already the case?


