Technology wants to be your mental health manager

Traditional therapy can be costly and sometimes inaccessible for many individuals.

Summary

  • While the age of AI has enabled plenty, digital shrinks aren’t just weird; they are potentially dangerous

The world now appears to believe that applying technology to a problem can somehow yield a solution that seemed impossible even a few years ago. For instance, despite the existence of online platforms for video conferencing and sophisticated ‘smart’ devices in patients’ homes or on their bodies, telemedicine never really came into its own before the pandemic and the boom in artificial intelligence (AI). This is simply because in-person interaction between doctors and patients is an integral part of the healing process. It is with this lens, then, that we must also approach mental health via the internet, since it is a specialty that necessarily requires person-to-person interaction.

Or does it? Psychologists have long known that human decision-making is flawed, even if it is sometimes amazingly creative, and that overconfidence is an important source of error even in routine settings. Much of the motivation for applying technology (and specifically AI) in medicine comes from the knowledge that to err is human: overconfidence is an established cause of clinical mistakes, and doctors are no exception. That said, this was a view from a time before we had ‘large language’ AI programs. We also know that biases can and do creep into AI tools. So, for now, it’s biased tech versus overconfident humans.

The MIT Technology Review reported in June 2020 that downloads of mental health apps rose 19-fold early in the pandemic, with a 14-fold increase in people who said they were downloading these apps to relieve anxiety. The Review quoted John Torous, director of digital psychiatry at Harvard’s Beth Israel Deaconess Medical Centre, who feels that, in hindsight, these apps may mark a turning point in widening access to mental health care, but cautioned that when they are used as stand-alone tools or single interventions, there is good evidence from meta-analyses that they are not as effective. While they may serve as adjuncts to therapy, the evidence suggests that therapy alone is more effective. That said, traditional therapy can be costly and inaccessible for many individuals, and this creates a gap that apps can fill.

Mental health apps come in many forms. They range from meditation and mindfulness apps like Calm and Headspace, to mood trackers and apps designed for cognitive behavioural therapy interventions, such as Woebot. Some are designed to help manage specific conditions like anxiety, depression or post-traumatic stress disorder, while others are general wellness tools. The role of AI extends to chatbots too. These bots, built on natural language processing, can simulate therapeutic conversations, offering immediate assistance when human therapists aren’t available. This bot bit is frightening to me, not just on account of Torous’s warning, but especially because of the biases that can be built in. Progress is inexorable, though, and I believe we may start to see more acceptance of this sort of ‘simulated therapy’ as we go along.

The recently released watchOS 10, the latest operating system for Apple’s wrist device, has, almost on cue, included a feature called State of Mind. Adrienne So of Wired magazine, in a review of the watch’s OS, goes into some detail on its relative merits and demerits (bit.ly/3Q7P5V7). She says, “Now, in addition to collecting images, we are being encouraged to log our very complicated, human feelings on the iCloud—regardless of whether you consider Apple’s reputation for privacy to be earned. (Apple seems to have taken a great many steps in this regard; however, I know better than to promise you that your data will never be leaked.)"

The app checks in on a regular basis to find out how you are doing. It’s not dissimilar to many advanced mental health apps which come equipped with tools that allow users to monitor their moods, thoughts and behaviours. With iOS17, Apple has also introduced Journal, a new iPhone app. Says Apple: “It helps users reflect on everyday moments and special events in their lives. To help inspire a user’s journal entry, personalized suggestions can be intelligently curated from a user’s recent activity, such as photos, people, places, workouts, and more, and scheduled notifications can help build a journaling habit." (apple.co/3tpYvmc). Integrating features like writing a journal, tracking moods and even biometric data (like heart rate or sleep patterns) gives users real-time feedback on their mental state. This feedback loop ostensibly enables users to identify triggers, understand patterns and take proactive steps towards better mental health.

With the proliferation of mental health apps, there has been a heightened focus on the ethics and privacy surrounding them. Users are entrusting apps with highly sensitive information. Recognizing this, state-of-the-art apps prioritize data encryption, anonymity and transparency in their data usage policies. Moreover, there is ongoing debate about the responsibility these apps hold, especially while dealing with users who may be at risk of self-harm or suicide.

As So says in Wired, “As someone who has struggled with anxiety for years, State of Mind is simultaneously inadequate and painfully intimate. Tracking your sleep is one thing, but it’s weird to take an assessment outside of a doctor’s office that asks you whether you’ve ever considered ending it all." To me, that’s more than weird. It’s dangerous.
