Transcript edited for clarity
Learning Objectives
After the webinar, clinicians should:
1. Understand the risks and benefits of apps
2. Lead a patient through informed decision-making around picking apps
3. Spot red flags and any potential safety issues in these technologies
Welcome to this presentation titled Navigating the Sea of Mental Health Apps. I'm Dr. John Torous, an assistant professor at Harvard Medical School and Director of Digital Psychiatry at Beth Israel Deaconess Medical Center in Boston, with no relevant conflicts of interest to this content. The learning objectives are to understand the risks and benefits of apps, specifically mental health apps; to lead a patient through informed decision-making around picking apps; and finally, to spot red flags and any potential safety issues around using apps or related digital mental health tools and technologies.
To get started, the first question is how many apps exist and where are patients finding them? There's no exact number, but we made an estimate a couple of years ago of about 10,000, and that number has stood the test of time. It's clearly not exact and it's going to fluctuate, but it's certainly more than a thousand and less than 20,000.
Why are there so many of these apps? The main reason is that the barriers to entry are relatively low. It's not like developing a new drug; it's developing an app and submitting it to the Apple or Android commercial marketplaces. And this is getting increasingly easier, especially as the price of developing an app continues to go down. Your patients are likely being directly advertised these services if they're on social media, and they're probably even sometimes getting mailed brochures about them. There's a lot of direct-to-patient or direct-to-consumer advertising that, in the end, is directing people to download these things from the Apple or Android app stores.
What can we learn from this sea of apps that are on the stores? Our goal in this talk is not to endorse any specific app. It's also not to say any specific app is bad or that we would not recommend it. We're bringing up apps as educational examples without any implied endorsements. But in making sense of these apps, it may also help to think broadly about what a mental health app is. As you can see in the circle figure, apps that help people with diet could improve mental health; apps that help with fitness, self-care, and sleep could certainly improve mental health.
There's a whole category of apps that offer access to videos or synchronous telehealth, i.e., the ability to communicate with a clinician, and those can be a type of mental health app. It doesn't have to be the classical CBT app when we think about what a mental health app is. If we look at the figure in pink, you will see mental health app types:
- chatbots
- educational apps
- self-improvement solutions
- teletherapy applications
- mood and symptom tracking
It's worth taking a broad view of what a mental health app is and does, rather than thinking of it classically as just, say, a CBT app, when making sense of it all.
How to make sense of apps
The FDA is beginning to offer approvals of some apps. This was a paper that came out this year, and it looks at many apps that have received different types of approval. The reason I put a question mark here is that this paper actually raises a question: “Is attrition actually going to be different outside of these studies?” When we use these apps in the real world, without the support of a clinical study, do they offer the same rates of engagement? Do people actually stick with them, or do we need to reconsider what it means to be, say, an effective or useful app?
At this point, my sense is that if an app has FDA approval, that's not a bad thing, but it doesn't really mean it's necessarily better than a free app on the marketplace. It doesn't mean it's going to work well for your patients; it just means the company has gone through the process. And I think, as we'll discuss, even the FDA is learning a little bit about what makes an app, say, more effective, more sustainable, and more durable in its impact. I personally wouldn't pay more for an app that has FDA approval. At this point, some companies have wanted to put products forward, but as we'll see, a lot of these products have very similar profiles or offer similar things, and it's unclear that an FDA stamp means an app is any better. I would personally say no.
Understanding patient needs
I think before we even talk about apps, we can take a step back and ask: How do we make sure these technologies are accessible and equitable for all our patients? How do we consider something like digital literacy? How do we help our patients access technology if they don't have access to phones? What resources can clinicians learn about to help patients improve their digital literacy skills? Let's start by considering basic digital literacy, before we even get to digital health literacy or the digital mental health literacy we're talking about.
Digital literacy 101
Think about just how many things in the world now require the use of technology. I think this was a wonderful figure from NPJ Digital Medicine, where the authors of the paper said that this may be a new social determinant of health. In the top right, you can see we need technology to access health care, but think about how you need technology to access education today. It can be necessary for community and social support, for neighborhoods, for utilizing your physical environment, for economic opportunities, and even for accessing safety net programs for food and basic resources. And I think one of the quotes from the paper is very important for us to keep in mind:
“Research shows that nearly one half of older adults and 30% of those earning less than $30,000 [a year] own a smartphone and many low-income households share devices, raising both access and privacy issues.” -Cynthia J. Sieck and colleagues
If we focus on the access issue, what this tells us is that many people with fewer resources may be what we call smartphone dependent. This means the smartphone is the primary device they use to access the internet, rather than a computer. This data comes up in different populations: if you look at Medicare populations, we again see that smartphones are becoming the primary device many patients use to access all of their digital health services.
We're not saying computers aren't important; they have a very important role. But if we have to focus on one device, it should be the smartphone. If people have a smartphone, how do we know they're comfortable with it, and that they have the knowledge, skills, and confidence to use it as part of their health and mental health care?
There's no simple, well-established scale for assessing digital literacy today. There are some out there, but a lot of them are computer focused. I thought this one was interesting: it came out in 2022, it's called the Digital Health Literacy Scale, and it really only has three questions:
- I can use applications/programs (like Zoom) on my cellphone, computer, or another electronic device (without asking for help from someone else)
- I can set up a video chat using my cellphone, computer, or another electronic device on my own (without asking for help from someone else)
- I can solve or figure out how to troubleshoot basic technical issues (without asking for help from someone else)
Each question is scored 0-4, from strongly disagree to strongly agree. The authors have not established cutoffs for what counts as below average, average, or above average. But what's nice about this scale is that it shows you that even with three questions, you can quickly begin to assess digital literacy and see whether your patients may need a little extra help somewhere.
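If you capture these responses in an intake form or survey tool, the scoring is simple to automate. Below is a minimal sketch, assuming the three items above; the follow-up threshold is a hypothetical illustration only, since the authors have not published validated cutoffs.

```python
# Illustrative sketch: scoring the three-item Digital Health Literacy Scale.
# Each item is rated 0-4 (strongly disagree to strongly agree), so the total runs 0-12.
# NOTE: the authors have not published cutoffs; the follow-up threshold below is a
# hypothetical example, not a validated value.

def score_digital_literacy(responses: list[int]) -> dict:
    """Sum the three 0-4 item ratings and flag patients who may want extra help."""
    if len(responses) != 3 or any(not 0 <= r <= 4 for r in responses):
        raise ValueError("Expect three ratings, each between 0 and 4.")
    total = sum(responses)
    return {
        "total": total,                     # 0-12
        "offer_extra_support": total <= 6,  # hypothetical threshold for follow-up
    }

# Example: comfortable with Zoom (3), less sure about video setup (2) and troubleshooting (1)
print(score_digital_literacy([3, 2, 1]))  # {'total': 6, 'offer_extra_support': True}
```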
And in a second, we'll talk about the Digital Navigator Program. That can be one solution you can offer to help patients who do need more assistance around digital literacy. You don't have to solve all challenges. Digital literacy is complex, but I think it's important that we at least screen for and assess it. It could be as simple as asking, “Are you comfortable with a video visit? If I offer you online prescription renewals, are you okay with that?” Many patients will be. But by asking, I think you may be surprised that some people would like a little bit more help.
Access
The data shows that most people in America today own a smartphone. It’s getting close to 90%, and the rates are not that much lower for people with mental illness, including those with serious mental illness like schizophrenia. There are very easy ways you can help patients get a device if they don't have one: help them get a smartphone, or help them get a low-cost data plan. Even if they're currently paying retail prices, the Federal Communications Commission (FCC) has the Lifeline Program for Low-Income Consumers.
This is something I've used with my patients, and it works very well. Go to the FCC website directly, because there are some sites with similar names that are not it. After a quick verification, it will link your patient to a very low-cost phone and a very, very cheap data plan. I've had patients save $50 a month, which really adds up, by qualifying for this federal government plan.
There are ways to support your patients in getting basic technology that are not hard and will not take a lot of clinical time. So let's assume we can help people get devices and we've assessed digital literacy; we'll talk more about helping people build skills.
Patient privacy
What apps are out there? As we’ve said, there are thousands of them. Which ones are good? Which are bad? Are they protecting data? This is a figure from Consumer Reports, the investigative magazine, which looked at these apps in 2021. This was not something we did. They basically asked questions such as:
- Is it clear within the privacy policy where data is shared?
- Can you opt out of research?
- Does the app share data ONLY with companies named in the privacy policy?
You can see overall that some apps were getting those yellow circles of yes, but there were a lot of Xs and question marks. A year later, the Mozilla Foundation, the people who in part built the Firefox browser, wrote this report, and you can see the headline is “Top Mental Health and Prayer Apps Fail Spectacularly at Privacy, Security.” What's interesting is you can just type “Mozilla Foundation mental health apps” [into a search engine] to pull up the report. As apps have responded to the Mozilla Foundation's critiques around privacy, the Mozilla Foundation has been updating [the report]. You can actually go look at it today and figure out whether an app you're interested in is on this list or not. Ideally, it doesn't have an exclamation mark for privacy concerns next to it.
Exploring the evidence
We want to look not only at privacy but also at the evidence, and perhaps that sometimes feels like navigating complex parking rules. We're still learning exactly what we can do, and sometimes, as with parking rules, it feels like things are changing and we're always a little bit uncertain. But we need to realize that using apps to diagnose or treat mental health conditions is a relatively new phenomenon, and we have to respect that the evidence is still evolving. In that light, we should not overjudge where the evidence is, but we also should not assume that these things are ready to go and will work well today.
One interesting story related to this evidence is on the top left, where we have a screenshot from a paper on an app called Focus, which helps people with psychosis-related disorders with everything from medication adherence, to mood, to symptoms related to schizophrenia. It's an impressive app that has had numerous studies in prestigious journals with some pilot efficacy data. However, the app was changed after Novartis bought the rights to a version of it, so it's clearly not the same thing.
Novartis actually ran a study with Pear Therapeutics (again, not exactly the same app as Focus), and you can see the headline of the 2022 paper that came out: A Smartphone-Based Intervention as an Adjunct to Standard-of-Care Treatment for Schizophrenia: Randomized Controlled Trial. What's fascinating about the study is that they used a digital control group: one group was randomized to this app, and a different group was randomized to an app icon that, when clicked, would simply say “you have 10 days left in the study,” “you have five days left in the study.” It was really just a countdown timer. As you might expect, the group that got the impressive, full-featured app saw their psychosis scores get lower, their anxiety scores get lower, and their depression scores get lower. That's great. What's interesting is that the group that got the countdown timer had the exact same results: their depression, psychosis, and anxiety scores all got lower. So the question is, what does the evidence mean in that case, when the control group is in some ways doing just as well as the intervention group?
I think in part it tells a similar story to a meta-review of meta-analyses that I was part of, led by Simon Goldberg and published this year. The point is that we found a lot of studies comparing a mental health app to an inactive control (shown in pink). When you compare an app to an inactive control group, the effect size of the app looks rather large; you could argue some medications have an effect size of, say, 0.5 or 0.6. But when you compare it to an active control group, the effect size (shown in green) is smaller. We're perhaps seeing that the effect size is not zero and the apps aren't hurting people, but the effects are in a smaller range. The point is that as the science gets higher quality, the studies get more sophisticated in their methodology, and the control groups get more appropriate, we'll get a better picture of when apps work and who they work well for, and we'll probably move beyond the standard of comparing an app to treatment as usual or nothing. If we have concerns today about a stopwatch on your phone offering mental health benefits, we should also note that there are other studies showing that playing Tetris can offer benefits equal to mindfulness. To really understand if and how apps work, we should be comparing app interventions to things that apps can already offer people today. I think it's a little bit hard to evaluate the evidence because not many studies offer higher-quality methodology comparing an app to an appropriate digital control condition. If someone shows you an app compared to treatment as usual or an inactive control group, I would take that as a first step and look forward to their next study offering even stronger evidence.
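For readers less used to thinking in effect sizes, the numbers quoted here (0.5, 0.6) are standardized mean differences. As a reminder, the standard formula (general statistics background, not a calculation from the studies above) is:

```latex
% Standardized mean difference (Cohen's d) between an app group and a control group
d = \frac{\bar{X}_{\text{app}} - \bar{X}_{\text{control}}}{SD_{\text{pooled}}},
\qquad
SD_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,SD_1^{2} + (n_2 - 1)\,SD_2^{2}}{n_1 + n_2 - 2}}
```

So an effect size of 0.5 means the app group improved about half a pooled standard deviation more than its comparator, which is why the choice of comparator (nothing, a waitlist, or an active digital control like the countdown timer) matters so much.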
Engagement challenges
We also have to think about engagement. As with giving someone a course of CBT, we want to make sure it's possible for patients to go through it. If we offer a medication, we want to make sure the medication will be tolerable and they can stick with it. If we offer someone an app, we want to make sure they keep using it, that it is something they can turn to, and that they can get a useful outcome from it.
If we think about engagement with apps, we have to be realistic and think about how people in America use apps generally. This is older data from 2016, and it's not looking at mental health apps; it's looking at all different categories of apps: productivity, photography, communication, and video apps. The main point is that a lot of people download apps, but if you look at the percentage of active users, it quickly falls off. By 10 days, we're down to, say, 10%, and it keeps decreasing. The question is, is that the same for mental health apps? The answer is roughly yes.
This is research from Amit Baumel. It was a very nice paper where they looked at apps with over 10,000 downloads (so perhaps the more popular apps, by that metric). You can see that overall, people stop using these apps at around 10 days, so it's not much different from how the general population uses any type of app. On the curve on the right, you can see that the apps offering peer support actually did a little bit better; they didn't quite buck the trend, but certainly engagement is challenging. I think this tells us that if we just give our patients an app and say, “Use it,” we're likely going to see an engagement curve like this. Recognizing that engagement is a challenge, we are going to have to clinically integrate apps and make them part of our treatment plans if we want them to be useful and have higher rates of engagement.
As we think about how to integrate apps into care, it's going to be different for your setting, the patients you work with, and the type of clinic, group practice, or private practice you work in. This was a nice paper that looked into the therapist and therapeutic experience of using apps in blended care settings. It's worth thinking about these nontechnical issues: What are the facilitators? What are the barriers? In some ways, a barrier can be whether the app makes your care more or less efficient. Is it going to give you more tools to use? Can you automate different parts of it and refer to expert resources, or is it going to be too hard to access all those resources? Is it going to offer you a lot of flexibility, or is it going to limit you by putting you on a different track? And is it going to allow you to tailor it, or is it going to take so much extra skill to use that it actually leads to trouble?
What I'm trying to highlight here is that there are barriers and facilitators to anything you want to implement, and it's worth thinking of those beyond any specific app you're looking at; these are broad principles to keep in mind.
Digital Health Navigator
One thing that can help with implementation, and something I've been working on with our team at Beth Israel, is the Digital Health Navigator (DHN). We recently partnered with a project called SMI Adviser, which is supported by a grant from the Substance Abuse and Mental Health Services Administration (SAMHSA) and administered by the American Psychiatric Association (APA). A DHN can be a new member of the health care team, or it can be a clinician, front desk staff, or someone else on your team who has a little additional training and expertise around using smartphones. They help patients use Apple and Android smartphones, select apps, and stay engaged with [their devices]. They also preview and summarize some of the app data.
We're about to release completely free online training around digital navigators. If you go to https://smiadviser.org/dhn, you can sign up and you'll get an email notification relatively soon, when it's ready. You can also learn more about what a DHN is and see if that is a role you can use, again, to help implement apps. Going back to digital literacy, this is something you can use to help your patients gain digital literacy skills. The DHN role is very pragmatic and practical. The training is 10 hours, and some parts may be easy if you already have digital skills, but it's something I'd encourage you to look into.
Digital therapeutic alliance
If we want to implement these apps, and engagement in some ways necessitates that we do, it's also worth thinking about the digital therapeutic alliance and how that alliance may impact your patient. We know that a stronger therapeutic alliance is a predictor associated with better outcomes. We can in some ways extrapolate and ask what an alliance with an app would look like. It's not going to be the same as an alliance with a human being, but it may help us think about the goals we're using an app for, whether the tasks the app offers are appropriate and the right steps for the patient, and what type of support or bond it provides. Is the app helping my patient move toward a goal? Is it engaging? Is it offering support if they run into trouble? Is this where I have a digital navigator help the patient go through it and supply information they're missing? It's worth thinking about the digital therapeutic alliance, and these are questions related to it:
- I trust the app to guide me towards my personal goals [Goals]
- I believe the app tasks will help me to address my problem [Tasks]
- The app encourages me to accomplish tasks and make progress [Bond]
- I agree that the tasks within the app are important for my goals [Goals]
- The app is easy to use and operate [Tasks]
- The app supports me to overcome challenges [Bond]
Risks and benefits
A lot of what we've talked about (from app privacy, to the evidence for apps, to engagement or ease of use, to clinical actionability, to integration and the alliance) is actually summarized in this pyramid, which is reflected in the American Psychiatric Association's App Evaluation Model. That framework can easily be found online; there's a completely free website for it with videos, examples, and more details on how you can approach an app through this framework. In a second, we'll talk about a related project our team has done, not part of the American Psychiatric Association, called mindapps.org, which is a database that lets you apply these principles to search mental health apps today.
Once again, if you’ve thought of an app or there’s an app you're interested in, there are also a lot of ways to deploy it. You could tell your patient to use it in a “self-help” way, though we have to be a little worried about engagement. In a “guided” way, you could have many patients using the same app, helping nudge them forward and, when they run into trouble, giving them support and encouragement. In a “hybrid” manner, you could be working with patients one on one and using the app outside of sessions to augment and extend care. And you could be doing a combination of all of these with any app. There are a lot of different examples of how you can use an app and how you can deploy it.
One example of how we're using perhaps a little bit of all three is our Digital Clinic Model at Beth Israel Deaconess Medical Center. Between sessions, we're having patients record their symptoms, as well as behaviors such as sleep, on a smartphone app. We're also having them practice self-help skills, such as mindfulness skills, that are related to their care.
We ask people in the clinic to practice things between sessions, and we have a digital navigator help by driving engagement, making sure people can set up the app, collect this data, and do these exercises, as well as by summarizing the data for the telehealth visit.
And then we talk about that data: What did we learn from the trends in symptoms? What did we learn about, say, sleep? Did the exercises we assigned help? Then we repeat the cycle with new information each time: new surveys, new resources, new app interventions. So in some ways we're using a digital navigator, we're having people use self-help apps, and we're bringing it all into care. From a clinician's point of view, it can be an exciting way to practice, because I can come to my sessions with my outpatients and have a lot of fascinating data to talk about with them. Not all data is going to be accurate, and not all data is going to be useful.
Sometimes people fill out scales as a way to show me that they're not doing well. Sometimes they fill out scales and the results do reflect what's happening. Either way, it provides a useful launching point for starting a clinical session and working directly with people. To implement this, there's a digital navigator, there's the clinical session, and then there's the app; in our case, it's the mindLAMP app that our team has developed to support tight integration into our digital clinic model. But of course, there are many ways you can deploy these apps, and I think that's what makes it so exciting.
Mindapps.org
We started this talk with the fact that there are perhaps 10,000 apps. We've built a website called mindapps.org, an index of mental health apps. We have over 700 apps, and it's the largest publicly available, completely free database of mental health apps in the world. On the left of mindapps.org, you can see different filters, such as cost, and you can expand those out to developer type and supported conditions. Here I've expanded out the different engagement features an app could offer. The idea is that you would go through it with your patient, or ask your patient to go through it, and check off what you're looking for. Maybe you want an app that has video and peer support; you would check those two, and in real time the database will show you the apps that have those two features. This is a different view of it: we rate apps on 105 questions, so the list could scroll on for a long time, but we're very transparent about what all the data is.
You can choose all the apps that meet specific criteria. As an example, you can look at how many apps are totally free, have a privacy policy, and offer CBT; it will return only the apps that meet those conditions. And if there are still too many of them, you could say, show me the ones whose privacy policy says they don't share my data with anyone, or show me the ones that support eating disorders, and it would begin to narrow down [the results]. Eventually you can get to the point where no apps are left, but you or your patient can quickly use this to specify which features are important and see what the database returns. We keep every app updated every six months. It takes a dedicated team of volunteers to run this, but I think it provides a very useful, hands-on resource you can use to explore apps and see which ones may be of interest.
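To make the search logic concrete, here is a minimal sketch of the kind of feature-based filtering described above. This is a hypothetical illustration, not mindapps.org's actual code or data schema; the app names and fields are invented.

```python
# Hypothetical sketch of feature-based filtering in the style of the mindapps.org search.
# The app names and fields below are invented for illustration, not real database entries.

from dataclasses import dataclass, field

@dataclass
class AppRecord:
    name: str
    total_cost: float                 # 0.0 means totally free
    has_privacy_policy: bool
    shares_data: bool                 # True if the policy allows sharing with third parties
    features: set = field(default_factory=set)    # e.g., {"CBT", "peer support", "video"}
    conditions: set = field(default_factory=set)  # e.g., {"depression", "eating disorders"}

def filter_apps(apps, required_features=(), required_conditions=(),
                free_only=False, privacy_policy_only=False, no_data_sharing=False):
    """Return only the apps meeting every checked criterion; each added filter
    narrows the list, mirroring the real-time search on the website."""
    results = []
    for app in apps:
        if free_only and app.total_cost > 0:
            continue
        if privacy_policy_only and not app.has_privacy_policy:
            continue
        if no_data_sharing and app.shares_data:
            continue
        if not set(required_features) <= app.features:
            continue
        if not set(required_conditions) <= app.conditions:
            continue
        results.append(app)
    return results

# Example: free apps with a privacy policy that offer CBT
catalog = [
    AppRecord("ExampleCalm", 0.0, True, False, {"CBT", "mood tracking"}, {"anxiety"}),
    AppRecord("ExampleCoach", 9.99, True, True, {"CBT", "video"}, {"depression"}),
]
print([a.name for a in filter_apps(catalog, required_features=["CBT"],
                                   free_only=True, privacy_policy_only=True)])
# ['ExampleCalm']
```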
See what's out there. Look at the free options before you pay for an app or ask your patients to pay; see if you can find one that costs absolutely nothing. Search for apps by privacy policies, or by the types of evidence they have. I'm not here to tell you what to look for, and I'm not here to make assumptions about what you or your patients need, but consider mindapps.org, which is in some ways a database that you can interactively search to learn what is out there in the app space. Thank you very much for joining and learning more about apps with me.
Earn CME for watching the webinars with a Webinar CME Subscription.
REFERENCES
Kalinin K, May 7, 2022, https://topflightapps.com/ideas/how-to-build-a-mental-health-app/
Nwosu A et al, Front Psychiatry 2022;13:900615
Sieck CJ et al, NPJ Digit Med 2021;4(1):52
FCC. Lifeline Program for Low-Income Consumers. https://www.fcc.gov/general/lifeline-program-low-income-consumers
Comstock J, January 13, 2021, https://www.mobihealthnews.com/news/novartis-trial-shows-no-benefits-pears-schizophrenia-app-ceo-cites-trial-irregularities
Goldberg SB et al, PLOS Digit Health 2022;1(1):e0000002
Baydia A, March 15, 2016, https://dazeinfo.com/2016/03/15/android-apps-uninstallation-usage-retention/
Doukani A et al, BJPsych Open 2022;8(4)
Henson P et al, Lancet Digit Health 2019;1(2):e52-e54
__________
The Carlat CME Institute is accredited by the ACCME to provide continuing medical education for physicians. Carlat CME Institute maintains responsibility for this program and its content. Carlat CME Institute designates this enduring material educational activity for a maximum of one-half (0.5) AMA PRA Category 1 Credits™. Physicians or psychologists should claim credit commensurate only with the extent of their participation in the activity.