Why We Love Shrink Apps
WHY YOU SHOULD CARE
Sometimes you just need a cold shoulder to cry on.
By Leslie Nguyen-Okwu
I was going through some dark times: Work was piling up, my future was crumbling, the world was ending. “But how does it all make you feel?” asked the polished blond woman whose taut face wore a saccharine smile. My shrink. I shifted, uncomfortable, in the brown leather chair. This was her therapist shtick and that was my first and last session ever. She was stiff and synthetic. She might as well have been a robot.
In theory, psychologists are supposed to heal you through divorce, death and daddy issues. Then there’s practice. Some secrets you wouldn’t divulge to your diary, let alone your shrink. Sometimes their good intentions get lost in all the psychobabble. Often, you just don’t click with the person to whom you’re paying hundreds of dollars per hour. And by the time you’ve educated him or her on your life history and your panoply of neuroses, the sunk costs hurt — and keep you there. It’s time for a different way. What if your therapist were manufactured for the job — not fickle fleshed but ironclad? In other words, a robot.
The future of robo-therapy is not as far-flung or far off as you think. There’s Nao, a knee-high humanoid that communicates with autistic kids who shun most human interaction. And BlabDroid, a squat, cardboard-framed robot that asks you deep, dark questions in a childlike voice. In fact, social robots can work surprisingly well in child therapy, sex therapy and the treatment of depression — sensitive areas that are shattered by inappropriate human responses, says Joanne Pransky, a self-dubbed robotic psychiatrist. (Her specialty: the feelings that automation evokes in human beings.) These robo-shrinks, free of the prejudices that might sway a human psychiatrist’s diagnosis, could offer an impartial listening ear. Even better, by acutely analyzing your body language and facial expressions, they can detect traits about you that a regular ol’ human might miss.
Plus, cozying up to a tough-as-nails bot could create a whole different kind of transference. The phenomenon is called the ELIZA effect: the curious way people ascribe emotion and intention to a soulless computer program. Some humans will indeed spill their guts to a robot just as they would to other humans, says Sabine Wilhelm, Massachusetts General Hospital’s chief of psychology. Maybe the transference would be even healthier and more empowering: Unlike a flesh-and-blood therapist, the robot comes from a “place of submission rather than dominance,” points out Alex Reben, who brought BlabDroid to life.
However, you shouldn’t ditch your therapist for a robot just yet, argues Catalia Health CEO Cory Kidd. He would know: His San Francisco startup creates interactive health care robots for patients of all stripes, and Kidd believes that the future of social robots lies in helping doctors “extend their reach more effectively,” rather than running them out of jobs. After all, is a robot even capable of discerning when to break confidentiality if suicidal thoughts, child abuse or murder plots come its way? And the more philosophical question: Can a robot shrink replace the human warmth that most of us crave, especially when it comes to matters as intimate as our mental health? Unlikely, says Kidd. “It’s not a good thing across the board.”
But at least a robot can patiently listen to you and your problems 24/7, a constant, albeit cold, shoulder to cry on whenever you need it. Let’s just hope it won’t charge by the hour.