
Will Your Future Smart Device “Red Flag” You?


Source: Dariusz Sankowski on Pixabay. Modified. Used with permission.

There is a recent push for new mental health strategies to prevent violence and other social ills. One approach being explored is technological: Mental Health Apps (MHAs), which offer new opportunities to reach patients and address risks. But what rules and strategies must emerge along with MHA technology?

Mental health apps have been available for some time, as mentioned in a prior article. The first-generation MHAs mostly provided reminders and positive messages, which could be helpful for mindfulness, sleep hygiene, life/illness management, and skills training. Unlike human therapists, digital mental health apps are available 24/7. Besides providing journaling prompts and inspirational messages, mental health apps also collect user self-report data. User responses are kept in a database and analyzed to provide feedback.

New-generation MHAs integrate biosensors and devices such as smartwatches, phones, or sensor pads to monitor fluctuations in the user's daily signals. The latest devices record everything from physical activity and sleep patterns to skin resistance, temperature, blood oxygen levels, and ECG readings, and offer fall detection and even emergency medical alerts. These body-worn devices monitor readings and activity automatically, lessening the burden on patients of entering data by hand. The newest MHAs crunch all that bio-psych data using algorithms to identify trends, and employ AI to provide feedback. In the near future, they will likely also offer preliminary diagnoses and even treatments. For example, your future MHA biosenses an unusually high-stress reading and recommends a wellness checklist or relaxation module. You engage in a conversation with your AI therapist, and your device lets you know when your readings return to a healthier range.
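To make the trend-identification idea concrete, here is a minimal sketch of how an app might flag an unusual reading against a user's own recent baseline. The function name, the z-score threshold, and the heart-rate numbers are all hypothetical illustrations, not any actual MHA's algorithm; real products would use far more sophisticated models.

```python
from statistics import mean, stdev

def flag_stress_readings(readings, window=5, z_threshold=2.0):
    """Flag readings that deviate sharply from the user's recent baseline.

    readings: a time-ordered list of sensor values (e.g., heart rate in bpm).
    Returns the indices of readings more than z_threshold standard
    deviations above the rolling baseline of the previous `window` values.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Only flag upward spikes well outside normal variation.
        if sigma > 0 and (readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Hypothetical heart-rate stream: a steady baseline, then a spike.
hr = [72, 74, 71, 73, 72, 73, 71, 110, 74, 72]
print(flag_stress_readings(hr))  # → [7]: only the spike is flagged
```

The key design choice, comparing each reading to the user's own recent history rather than a population norm, is what lets the same app serve people with very different resting physiology.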

But questions remain: Where is the use of mental health monitoring data going in the future? What guardrails are needed for mental health data collected by MHAs and digital devices?

Several steps can be considered:

  1. Psychologists must validate the accuracy of MHAs. Consider the consequences of misdiagnoses, false positives, or false negatives. Beta testing an app is not as thorough as conducting clinical trials.1 Clinicians can partner with engineers and software developers to make MHAs more accurate, safe, and effective. The future of digital therapeutics requires clinical trials on efficacy and consumer education about the uses and abuses of new technologies. For example, some researchers have conducted trials of internet-based cognitive behavioral therapy for diagnoses of depression and anxiety.2 Such well-controlled research is needed for MHAs and body-worn sensor data to gain accuracy and acceptance.
  2. Rules are needed for how MHA data will be shared. Will user data go into digital mental health records? Will this data give patients better risk assessment and access to treatment? On the other hand, how or when will mental health data be used to “red-flag” those considered a risk to themselves or others? What will be the procedure to get a second opinion, or to question an AI-based diagnosis? How can users contest or remove a red flag assigned by an MHA algorithm? Strict user permissions and privacy protections are crucial for the new digital mental health records frontier, especially if we want patients to adopt and use the new technology.3
  3. MHAs will eventually evolve towards providing treatments. In the future, perhaps a high-risk score will trigger MHA recommendations to seek therapy, or guide potential patients to mental health services. Soon, virtual mental health assistants might serve as confidential sounding boards, prompting users to divulge their problems, stories, and feelings. Perhaps some folks will prefer “therapy” with an anonymous, nonjudgmental robot? This will be the brave new future world of computer-mediated assessment and therapy. Innovation and testing are still needed, but great potential exists for these technologies to guide services to address mental health concerns.4
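The concern in point 1 about false positives can be made concrete with a standard Bayes' rule calculation. The numbers below are purely illustrative assumptions, not measurements from any real app: even a screen that sounds accurate produces mostly false alarms when the condition it flags is rare.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a flagged user is truly at risk (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative (assumed) numbers: a screen with 90% sensitivity and
# 95% specificity, applied to a condition with 1% prevalence.
ppv = positive_predictive_value(0.90, 0.95, 0.01)
print(f"{ppv:.1%}")  # → 15.4%: roughly 5 of 6 red flags are false alarms
```

This base-rate effect is one reason the appeal and second-opinion procedures raised in point 2 matter so much: at low prevalence, most red-flagged users will not actually be at risk.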

As MHAs gain acceptance, developers and clinicians will need to establish rules to protect user privacy. The circumstances in which MHA data might be ethically and legally used to enhance public safety should also be defined. The key is to balance patients' privacy rights and HIPAA compliance with the desire to identify and intervene during mental health crises.

