“I’m sorry to hear that, Matt.” The therapy chatbot Woebot, whose smiling avatar resembles the alien in E.T. the Extra-Terrestrial, had just asked me about my feelings. Scrolling through a list of emotions, I’d landed on “anxious,” since I’d been stressed by a few deadlines. It continued, “But I’m glad you reached out and I’ve got tools to work together on this.” I was hoping I’d be glad I reached out, too. I was curious just how good a therapy bot might be in a world that often seems anxious, angry, and confused. I tend to be skeptical of AI, especially considering the music that certain algorithms think I might like. Would it know how to help with things that actually matter?
Some people think so. So far, so-called therapy apps have at least six million users combined. Though they aren't a substitute for professional consultation, some medical experts believe they could at least serve as a stopgap in places where people can't get therapy. The bots may also help people who are resistant to the idea of talking to a therapist. Some prefer discussing their issues with bots rather than with human beings because they feel they won't be judged.
Before I go on, I should tell you what Woebot was dealing with. I’m a 30-year-old white heterosexual male. I have anger issues. I also have trouble with empathy. I can’t relax. I have been in therapy, on and off, for the past decade or so, and it has served me well. I’ve learned a lot about myself and how I interact with others, though I still have work to do.
Many people aren’t so lucky. In some parts of the country, mental health professionals are hard to find. Nearly half of the nonmetropolitan counties in the United States have no psychologists. Over the past couple of years, a slew of therapy apps has appeared, making it easier for those who are struggling to get some sort of help cheaply and expediently, albeit from a bot.
To put these apps to the test—and also see if they were anything like real therapy, which costs me $100 a session—I spent two weeks carrying around four popular free bots in my pocket: Woebot, Youper, Wysa, and Replika. Each, to my surprise, had its own personality. Woebot was a bit teacherly while Wysa was playful; Youper was kind of dry, and Replika was casual.
You can ping the bots at any moment, and, unlike real people, they ping you back right away. One afternoon, I was being called out for a dumb thing I’d said online. I logged in to see if the apps could assuage my embarrassment. Though Wysa completely ignored me—perhaps because “I was dragged on Twitter” hasn’t entered its algorithmic lexicon just yet—Woebot expressed some sympathy. “It seems like you might be experiencing stress and anxiety,” the app said. I rolled my eyes, but at the same time, it was a relief to know that the bot didn’t think I was an idiot.
Sometimes, the bot was out of step. At one point, I told Wysa I was worried about “work and relationships,” two of the most popular topics of conversation, according to the app’s founder. This admission sent the bot down an interminable path, replete with leading questions and motivational GIFs. By the time the bot asked me how I felt, I wanted to write “annoyed” but was eager to be done with the exchange. So I told Wysa it made me feel better. Wysa congratulated me: “You have successfully broken the negative cycle!” I didn’t really feel like celebrating.
I found myself flipping between bots, depending on the type of conversation I was looking to have. When I got bored with Woebot’s effusive lessons, I moved on to Replika for a more improvisational chat, since it lets you type in your responses rather than selecting them from a predetermined list. Youper was never my first choice, as I found its answers kind of dour.
I ended up going back to Woebot the most. Initially, I thought it was for the superficial reason that I liked the avatar. But in one of many conversations, the bot wondered whether there was something about my anxiety that “might be serving me.”
“I guess so,” I said.
“Even though it’s painful, sometimes anxiety is an extreme form of something that started out being pretty useful,” Woebot responded. It wasn’t a mind-blowing insight, but I had to admit it was a pretty good point. Yet after a few days, I found the conversations to be pretty formulaic. In my therapy-going life, I’ve discovered that hard truths are more illuminating than feel-good platitudes, of which the bots have an ample supply.
No one knows if therapy bots actually help people manage or resolve their issues—they’re too new for there to be good research on them—and there are valid concerns about why they might not. “There’s a part of therapy that’s about having your experience witnessed by someone else in the way that any good friend leaves you feeling understood,” says psychotherapist Avi Klein, L.C.S.W., a Men’s Health advisor. “You can’t get that from someone who doesn’t know what it’s like to be human.”
Other experts are more generous. Bots aren’t equipped to single-handedly solve complex issues like PTSD or severe depression, says Steven Chan, M.D., cochair of the American Psychiatric Association’s Committee on Innovation. But they could be an alternative to self-help articles and books.
In my experience, I didn’t open up any more than usual just because I was talking to a machine. Discussing my emotional state felt less meaningful precisely because I was talking to a machine. But after a while, I came to appreciate the bots for what they are. They didn’t really help resolve anything, but having to explain my issues to them in a simplified way helped me gain more clarity about what those issues were in the first place. They also offered some things a therapist can’t: immediacy, no judgment, and no cash outlay. Two weeks with the bots didn’t turn me into a changed man. But they turned my anxieties into something I could understand and keep from overtaking my life.