A therapist and patient can tell each other a lot in an hour. But imagine if the patient and therapist also can keep in touch electronically when they aren’t together.

A stream of data from patients’ smartphones—on movement, sleep, screen time and phone use—could give therapists a more comprehensive picture of a patient’s state of mind, as well as an ability to intervene when a patient needs it.

Researchers are testing digital tools that could bring this vision to life, assisting therapists in managing patients’ mental-health conditions. They are designing apps that collect a continuous stream of data from smartphones, using voice-analysis software and parsing online-search behavior to better assess each patient’s condition, counsel them more effectively and predict crises. With these apps, mental-health specialists will be able to gather information about patients’ symptoms, help patients better understand the connection between their own patterns of behavior and mental health, and offer new tools to self-manage symptoms and practice therapy skills between sessions.

While there are plenty of mental-health apps on the market that aim to help patients cope, they aren’t generally designed to share information with a healthcare provider. The apps under development, by contrast, are designed to connect patients to doctors as part of a therapeutic relationship, with patients’ consent and active participation.

“We are not using technology to replace or be a substitute for care, but to augment it, by partnering with patients to bring valuable new data into a therapy visit and offer increased support for patients between those visits,” says John Torous, a psychiatrist and director of the Division of Digital Psychiatry at Beth Israel Deaconess Medical Center, a Harvard Medical School affiliate.

Of course, this vision only works when patients are willing to give personal data to their therapist. There are serious questions about how to ensure informed consent and how to safeguard the information users provide. Clinicians also must be trained in how to use the tools efficiently. But large healthcare providers are taking interest in the efforts as they seek solutions to a continuing shortage of mental-health specialists and a growing demand for their services, which has only intensified during the Covid pandemic.

Here are some of the efforts that show promise:

Gathering daily data

In weekly or even less frequent clinic visits, therapists often don’t get a complete picture of what their patients are struggling with daily. One possible solution: collecting data from mobile phones to give doctors a continuous look at their patients’ health.

Dr. Torous’s team has developed an app called mindLAMP that aims to paint a comprehensive picture of patient behavior and mental status. Using smartphone sensors, the app collects “digital biomarkers”—behavioral data such as screen time, sleep, steps and more—as well as gathering information from patients in the form of short surveys and cognitive tests.

In reviewing the data, doctors might spot that patients are doing fewer healthy activities related to exercise, sleep and mindfulness, and notice that their mood consequently changes. So, doctors might work with the patients to explore the relationship between mind and body and consider changes to lifestyle, therapy or medication regimens customized for their case.

In one study published earlier this year in the Journal of American College Health, 100 students at several universities took a series of mental-health assessments via virtual visits with Dr. Torous’s team, and then used mindLAMP for 28 days. At the end of the test period, the students had another virtual assessment. Results showed the smartphone data offered new insights into mental health, especially around how lack of sleep and exercise affected depressive and anxiety symptoms.

In a continuing study with 500 students, the team is now using these insights to predict when a student may be at risk and sending them personalized information and activities on the mindLAMP app to help prevent worsening symptoms. For example, if a student isn’t sleeping or has high scores on a depression-assessment questionnaire, algorithms can automatically recommend interventions from a group of mindfulness, cognitive therapy and behavioral exercises, or highlight patterns to be discussed in person at clinic visits.
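The triage logic described above can be pictured as a simple set of rules. The sketch below is purely illustrative, assuming hypothetical thresholds and names; the study's actual algorithms, cutoffs and intervention menu are not public in this article.

```python
# Hypothetical sketch of rule-based triage like the mindLAMP study describes:
# flag students whose sleep data or survey scores suggest risk, then suggest
# an intervention. Thresholds and names are placeholders, not the study's.

def recommend_intervention(avg_sleep_hours, phq9_score):
    """Return a suggested app activity, or None if no flag is raised.

    PHQ-9 scores of 10 or more conventionally indicate moderate depression;
    the sleep cutoff here is an arbitrary placeholder.
    """
    if phq9_score >= 10:
        return "cognitive-therapy exercise"
    if avg_sleep_hours < 6:
        return "mindfulness exercise"
    return None

print(recommend_intervention(7.5, 12))  # elevated PHQ-9: cognitive-therapy exercise
print(recommend_intervention(5.0, 4))   # low sleep: mindfulness exercise
print(recommend_intervention(8.0, 3))   # no flag: None
```

A deployed system would draw on many more signals and validated models, but the shape is the same: passive data and survey scores in, a recommended activity or an in-session discussion point out.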

Easing social isolation

For patients with serious mental illnesses like schizophrenia, prescribed medications help reduce symptoms like hallucinations, delusions, and confused or jumbled thinking and speaking. But they can’t help with the difficulty patients often have in establishing and maintaining social relationships.

During therapy, patients get help on how to manage relationship hurdles, but evidence suggests they could do better with continuing real-time support delivered to their phones at moments when they need it, according to Daniel Fulford, a psychologist and assistant professor at Boston University.

Dr. Fulford and his team are developing an app called Motivation and Skills Support to provide targeted social-goal assistance. In a study published earlier this year, 31 participants with a diagnosis of schizophrenia who were on medication used the app for eight weeks, setting a social goal, such as making a new friend. They were sent notifications twice daily—including questionnaires about mood and motivation—as well as automated feedback about goal progress and encouraging messages. For example, when participants reported low motivation to work toward their goals, the app would remind them of positive social experiences they reported in earlier interactions with the app.

Overall, participants showed improvements in social functioning after the intervention, and their perceptions of their social interaction also became more positive. “People are reporting on their thoughts and feelings, and that in and of itself is an intervention,” Dr. Fulford says. “If a notification comes to your phone and suggests a coping mechanism or an opportunity for people to improve the way they feel, the phone can be a conduit or a window into direct mental-health care.”

As part of the research, Dr. Fulford’s team is continually collecting GPS and movement data from patients’ phones, with their consent, as well as using the phones’ microphones to periodically sample audio and identify the presence and qualities of speech that indicate social connection.

For example, notable changes in a person’s movement or speech, such as prolonged periods of inactivity or lack of communication, could serve as a warning sign that the person is at risk of loneliness or isolation, triggering an intervention via the phone to increase social connection or support coping with difficult feelings.
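A minimal sketch of one such warning sign, prolonged inactivity, might look like the following. The step threshold and window length are hypothetical placeholders, not values from Dr. Fulford's research.

```python
# Illustrative sketch (not the study's actual code): flag a run of
# consecutive low-activity days in a patient's daily step counts.

def flag_inactivity(daily_steps, threshold=1000, window=3):
    """Return True if steps stayed below `threshold` for `window`
    consecutive days. All parameters are hypothetical placeholders."""
    run = 0
    for steps in daily_steps:
        run = run + 1 if steps < threshold else 0
        if run >= window:
            return True
    return False

week = [4200, 800, 650, 400, 5100, 4800, 300]
print(flag_inactivity(week))  # True: three straight days fall below 1,000 steps
```

In a real system a flag like this would not diagnose anything on its own; it would simply trigger a supportive prompt on the phone or surface the pattern for the therapist.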

Interventions like these—which are programmed to be delivered automatically, without a doctor’s direct supervision—can be effective at improving social functioning in schizophrenia, the study found. Dr. Fulford says the next step is to integrate such approaches into continuing care so therapists can personally prescribe apps and monitor activity between sessions to inform treatment.

Clues in the voice

At the University of California, Los Angeles, therapists are also using voice analysis to recognize patterns in tone or word choice that indicate a patient with a serious mental illness may be heading for a crisis. Such patterns may not be apparent at a clinic visit, but machine-learning algorithms can pick them up. Studies have used language analysis to detect suicidal thoughts, depression, neuroticism and cognitive impairment.

Researchers at UCLA’s Jane and Terry Semel Institute for Neuroscience and Human Behavior created a system called MyCoachConnect, which uses an interactive voice-response app to collect patients’ reports about their state of mind and analyze their voices over time. In a pilot study published last year with 47 patients and their case managers from a mental-health clinic, patients were asked to call one or two times a week and provide two- or three-minute free-ranging answers to questions from a computer-generated voice, including, “How have you been over the past few days?” and “What’s been troubling or challenging?”

The research team used machine-learning models to analyze speech samples to capture factors such as the choice of words patients use and how responses change over time, as well as tone of voice. They found that the app’s ability to determine patients’ mental-health state was in line with how physicians tracked their patients’ well-being during the 14-month study period. The app was also able to use voice samples from a given day to forecast the health state of the patient on a subsequent day, although with more modest accuracy.
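To give a flavor of what a "word choice" feature might look like, the sketch below counts how often a transcript uses words from hand-picked lexicons. This is a deliberately simplistic, hypothetical illustration; the MyCoachConnect team used trained machine-learning models, not fixed word lists, and these lexicons are invented for the example.

```python
# Hypothetical illustration of one text-derived feature: the rate of
# negative- and positive-lexicon words in a spoken-response transcript.
from collections import Counter

NEGATIVE = {"troubling", "tired", "alone", "worse", "anxious"}
POSITIVE = {"better", "calm", "hopeful", "rested"}

def word_choice_features(transcript):
    words = Counter(transcript.lower().split())
    total = sum(words.values()) or 1  # avoid dividing by zero on empty input
    neg = sum(words[w] for w in NEGATIVE)
    pos = sum(words[w] for w in POSITIVE)
    return {"neg_rate": neg / total, "pos_rate": pos / total}

print(word_choice_features("I have been tired and anxious and alone lately"))
```

Features like these, tracked call over call, are what let a model notice that a patient's language is drifting, even when any single response sounds unremarkable.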

An app that enables clinicians to respond more quickly to patients’ needs between visits, predict crises and help patients proactively “is much better than looking at apps in isolation and disconnected from human-based in-person care,” says Armen Arevian, who directed the research and is currently leading Chorus Innovations, a company spun off from UCLA to advance the software platform for other digital-health apps.

Dr. Arevian also collaborated with David Miklowitz at UCLA to develop a mobile app for use in a form of family-focused therapy developed by Dr. Miklowitz. In a pilot test, families used the app at home in addition to virtual or in-person therapy sessions, to practice communication and problem-solving skills. Participants included teens with active symptoms of depression, a parent with a mood disorder and at least one parent who was highly critical of the child.

Mental-health care providers had the app remind the families to work on certain goals, such as setting a discussion topic for the week and having the family practice active-listening skills. The providers also analyzed voice samples from the app, primarily to see what words parents used with their children, as a measure of communication style. The results were then reviewed in therapy sessions.

A study published last December in the Journal of Affective Disorders found that after 18 weeks of treatment, participating adolescents reported being less depressed and less frequently criticized by their parents, and clinicians reported an improvement in adolescents’ mood symptoms. A continuing trial is examining whether this technology enhancement to therapy is more effective than therapy alone, and researchers are investigating whether an algorithm that analyzes parents’ speech can help identify issues such as overly critical relationships with their children.

Searching for suicide risk

Technology could also be used to identify warning signs of suicide that may be missed by clinicians who see patients infrequently and have no way to effectively monitor those headed for a crisis. While data in electronic health records—such as a patient history of substance abuse or trauma—has been shown to help identify who is at risk for suicide attempts, it hasn’t been able to pinpoint when an attempt might happen.

Now researchers at the University of Washington are investigating the use of online search-history data to better understand suicide risk and develop better detection and prevention methods.

The researchers identified search terms and queries such as “final divorce decree cost,” “fits of rage” and “why do I have so much anxiety” associated with known suicide warning signs.

In a study published in June in the Journal of Medical Internet Research, 62 participants who had been admitted to a hospital or had emergency care for attempted suicide allowed researchers to review their online search history for the year before the attempt. In the retrospective study, an algorithm identified about 63% of the attempts, based on changes in search behavior and queries, as early as six months before the event.
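The core idea, a shift in search behavior relative to a person's own baseline, can be sketched as follows. The term list, window comparison and ratio cutoff here are invented for illustration; the published study's model and features were far richer.

```python
# Hypothetical sketch: compare how often recent queries match a list of
# risk-associated terms against a longer baseline period for the same person.

RISK_TERMS = {"fits of rage", "why do i have so much anxiety"}

def risk_term_rate(queries):
    """Fraction of queries containing any risk-associated phrase."""
    if not queries:
        return 0.0
    hits = sum(any(t in q.lower() for t in RISK_TERMS) for q in queries)
    return hits / len(queries)

def behavior_shift(baseline_queries, recent_queries, factor=3.0):
    """Flag if the recent risk-term rate is `factor` times the baseline rate.
    The factor is an arbitrary placeholder, not the study's threshold."""
    base = risk_term_rate(baseline_queries) or 1e-6  # avoid divide-by-zero
    return risk_term_rate(recent_queries) / base >= factor

baseline = ["weather today", "movie times"]
recent = ["why do i have so much anxiety", "fits of rage meaning"]
print(behavior_shift(baseline, recent))  # True: a sharp rise from the baseline
```

The point of anchoring to each person's own history is that a query that is routine for one user can be a meaningful change for another.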

Lead study author Patricia Areán, a professor in the UW department of psychiatry and behavioral sciences and a licensed clinical psychologist, says a potential use of the technology is to create alerts in electronic records if a patient is at risk in the next 30 days, allowing doctors to work with the patient to get them a change in medication, more frequent outpatient visits or perhaps an intensive outpatient program.

While participants reported that they felt the use of search data was potentially helpful, they expressed concerns about privacy. They also felt an algorithm predicting suicide risk would need to be close to 100% accurate in its detection, so that, for example, an alert from an online search would not send police to their house when they were simply looking for information and not really at risk.

Though more studies are needed to determine the usefulness of data from smartphones, searches and mobile apps for mental-health therapy, “if we can get this to work and engage patients to collect it and share it with us, this could really change up and enhance the treatment experience,” says Dr. Areán.

Ms. Landro, a former Wall Street Journal assistant managing editor, is the author of “Survivor: Taking Control of Your Fight Against Cancer.” She can be reached at reports@wsj.com.