Welcome to tomorrow. Artificial intelligence (AI) is now actual science, not science fiction. Although the field's formal inception took place in 1956, the idea of AI is known to most people only through imaginative movies such as The Terminator or the 2013 film Her. However, right here and now, AI is real and maturing at a near-exponential rate. Signs point to AI soon infiltrating society at large, which means that the counseling profession is not immune. The future of counseling likely involves virtual assistants, virtual counselors, chatterbots and, for the inclined, robots doubling as animals to help comfort clients.
Much of modern AI rests on machine learning, the process by which computers improve at tasks through experience rather than explicit programming. Current AI assistants such as Siri and Alexa have limited capabilities. The holy grail of AI is artificial general intelligence: machines with humanlike, versatile abilities. AI can be contrasted with organic intelligence, or, put another way, human, biological intelligence. Many factors contribute to human intelligence, chief among them our ability to process information, solve problems, adapt and learn. All of this happens in the brain, and in many ways, our brains are like computers. AI researchers apply findings from neuroscience to computer programming to make computers more like us.
The goal of AI, then, is not just the production of an ordinary computer, but one that learns and can become autonomous. And guess what? Computers can learn much faster than we can. Their intelligence is off the charts. Plot typical human intelligence quotients on a normal curve, situate Einstein's several standard deviations to the right, and try to imagine the placement of a conscious artificial intelligence (CAI). Now, envisage what a CAI is capable of doing, inventing, discovering and revolutionizing. The prospect is equal parts bewildering, intriguing and nerve-wracking.
Computers have already beaten the best humans at chess, Jeopardy! and, more recently and impressively, the board game Go. Widely played in Asia, Go is a strategic, intuitive game with a mind-blowing number of legal board positions (researcher John Tromp calculated that number for the standard 19-by-19 board to be roughly 2.082 × 10^170, a 2 followed by 170 more digits). Garry Kasparov (chess), Ken Jennings (Jeopardy!) and Lee Sedol (Go) are all very smart, and each fell to AI in his respective specialty.
But those are games. AI can’t “beat” the best counselor, can it? Surely not …
Relevance to counselors
Asking if AI has relevance for the counseling community is like asking if counselors should be concerned with global warming or if social media has an impact on the lives of our clients. AI originates in the hard sciences but promises to touch the emotional lives of clients in untold ways. The magnitude of AI's impact remains unknown. Some individuals are excited by the vast reach of AI, whereas others are cautious. Consider the following quotes:
“It would take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded. … [AI will be] either the best, or the worst thing, ever to happen to humanity” — Stephen Hawking
“We need to be super careful with AI. Potentially more dangerous than nukes.” — Tweet by Elon Musk, CEO of Tesla and SpaceX and co-founder of PayPal
“I am in the camp that is concerned about superintelligence. [At] first, the machines will do a lot of jobs for us and not be superintelligent. That should be positive if we manage it well. A few decades after that though, the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.” — Bill Gates
Should counselors be concerned? They should at the very least be educated about the subject. Knowledge is power. In the short term, the consensus is that AI will rapidly expand automation. When this occurs, the jobs of your clients who are employed in, for example, the fast-food industry might be threatened. Perhaps new jobs will be created, however. After all, people originally feared that the Industrial Revolution would lead to massive unemployment. In fact, the opposite happened.
Unlike a century ago, however, things are changing at a faster pace. The modern age is quickly morphing into a future of omnipresent technology. Change management may become an overarching theme of therapy in the near future. Change is coming fast because algorithms are being written that enable AI to expand its abilities into the realms of creativity, cooperation and emotional intelligence. It is here that AI directly converges with counseling.
Computers that care
In his book Thinking Machines: The Quest for Artificial Intelligence and Where It’s Taking Us Next, Luke Dormehl writes about how advances in facial recognition enable AI assistants to read the emotional states of users. The company Affectiva broadcasts on its website that its “emotion AI humanizes how people and technology interact.” Affectiva is at the forefront of bridging this gap by providing, so to speak, a corpus callosum between traditional computer acumen, with its mathematical and logical abilities, and the realm of emotional intelligence.
Facial recognition software is getting better. “Ellie” is an example. A virtual human powered by AI, Ellie was created by the Institute for Creative Technologies at the University of Southern California to help treat people with depression and posttraumatic stress disorder. On the computer screen sits Ellie, whose body language mirrors that of an actual therapist. She responds to emotional cues, nods affirmatively when appropriate and shifts in her seat. She does all of this because her software tracks 66 points on a person’s face and reads his or her emotional state accordingly.
It’s obvious that Ellie is not “real,” and therein lies the secret to her success — people feel less judged talking to Ellie. She provides the ultimate in unconditional positive regard. Although Ellie looks like a therapist, she doesn’t claim to be one, telling people from the outset, “I’m just here to listen.”
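For readers curious about the mechanics, systems like the one behind Ellie generally turn tracked facial landmarks into simple geometric features and then classify an emotional state from those features. Here is a deliberately toy sketch of that idea in Python; the landmark names, thresholds and labels are invented for illustration and are not Ellie's actual model, which tracks 66 points and is far more sophisticated:

```python
# Toy emotion estimator built on facial landmarks.
# A landmark is an (x, y) point on the face; real systems track dozens.

def mouth_curvature(landmarks):
    """Positive when the mouth corners sit above the mouth center (a smile).

    Uses screen coordinates, where a smaller y value is higher on the face.
    """
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    center = landmarks["mouth_center"]
    return center[1] - (left[1] + right[1]) / 2

def estimate_emotion(landmarks):
    """Map one crude geometric feature to a coarse emotional label."""
    curve = mouth_curvature(landmarks)
    if curve > 2:
        return "happy"
    if curve < -2:
        return "sad"
    return "neutral"

# Corners of the mouth (y = 58) sit above its center (y = 62): a smile.
smiling = {"mouth_left": (40, 58), "mouth_right": (60, 58),
           "mouth_center": (50, 62)}
print(estimate_emotion(smiling))  # → happy
```

A production system would track many more points, use learned classifiers rather than hand-set thresholds, and smooth its estimates over time, but the pipeline — landmarks to features to labels — is the same in spirit.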
Ellie has company in “Tess.” The developer, X2AI Inc., says Tess is a “psychological AI that administers highly personalized psychotherapy, psycho-education and health-related reminders, on-demand, when and where the mental health professional isn’t.”
This slogan speaks volumes about the future interplay of technology and mental health counseling. Counselors have families and need to sleep. Some even like to take vacations. AI has no need for any of the above.
As therapeutic AI becomes more mainstream, it is likely that some people will forgo seeing living, breathing counselors altogether in favor of their favorite virtual therapist. Others will see an actual counselor plus their online “listener.”
Of course, ethical questions abound. Nevertheless, like it or not, AI promises to play a greater role — either directly or indirectly — in the counseling sessions of the future.
A client’s truth
Skeptics may point to the obvious — that no machine is truly human; that humans need humans; that a machine can fake, say, empathy but not actually deliver it; and that clients will see through the façade.
The counterpoint to this criticism resides in a question: Who determines clinical truth? Rather than ask whether machines can be empathic, a more pragmatic question for counselors may be, will clients perceive them to be empathic? If so, what are the ramifications?
The evidence suggests that in some cases, people do indeed emotionally connect to computer programs. It has been happening since the mid-1960s, when MIT computer scientist Joseph Weizenbaum created “Eliza” to demonstrate the blurry threshold of man-machine communication. Eliza was a computer program that reflected statements typed to her via text. Weizenbaum was astounded when people began ascribing human emotions and feelings to the program, confiding personal information to Eliza and pouring their hearts out. Blurry boundaries indeed.
Eliza is still with us, available on several websites and ready to chat. Programmers declare Eliza a Rogerian therapist open for business. You just have to believe the illusion. That illusion may be a client’s truth.
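Eliza's method was surprisingly simple: match a keyword pattern in the client's statement, swap the pronouns, and reflect the statement back as a question. A minimal sketch of that technique in Python follows; the patterns and responses are illustrative stand-ins, not Weizenbaum's original script:

```python
import re

# Pronoun swaps applied to the client's words before reflecting them back.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# Keyword patterns paired with reflective response templates.
PATTERNS = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment):
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement):
    """Return an Eliza-style reflection of the client's statement."""
    for pattern, template in PATTERNS:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default nondirective prompt

print(respond("I feel anxious about my job"))
# → Why do you feel anxious about your job?
```

That a handful of pattern rules like these could move people to pour their hearts out says less about the program than about the human readiness to be heard.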
For the music therapists out there, AI has touched even one of the longest-running human traditions — making music. Sony Computer Science Laboratories is coordinating the Flow Machines project in conjunction with the European Research Council. The goal is to see if AI can autonomously create music.
Currently, AI still needs some human assistance. Your favorite singer undoubtedly has a better voice than your favorite robot. However, AI is helping and making great strides. Check out “Daddy’s Car” and “Mr. Shadow,” two pop songs created with the help of AI. The first is in the style of the Beatles, circa late 1960s. As for “Mr. Shadow,” listen and judge for yourself. Both songs are available on YouTube. Neither song may be suitable for music therapy, but their mere existence suggests that this is only the beginning of music created with thinking machines.
The question is, if AI can help produce music today, will it find a place in the music therapy of tomorrow? Will the act of music production itself — between a counselor, client and AI — prove therapeutic?
Counselors aren’t the only ones interested in how far AI creativity will expand. For more information about how AI is being used to create both art and music, research Magenta, a project from Google Brain.
Meet Paro, a therapeutic robot. You may know a lot of robotic baby seals (who doesn’t?), but none is like Paro, because this cuddly seal is interactive. Paro (known as a “carebot”) makes eye contact, has five senses, responds to its name and, like any good AI seal, learns. Paro’s website (parorobots.com) indicates that research has shown the carebot reduces stress, improves relaxation, motivation and socialization, and helps people who have dementia. Paro certification classes are even available. If you are wondering whether Paro runs on batteries, rest assured that Paro charges by sucking on an electric pacifier.
Even our animal compatriots will be affected by AI. It won’t be the first time that technology has altered the function of an animal, or its numbers. The advent of the internal combustion engine spelled the end of the horse-drawn carriage. The number of horses in the United States plummeted as a result. It’s easy to predict that therapeutic robots will play larger roles in counseling. On the bright side, they create fewer messes.
Currently, counseling is chemistry, an interaction between two or more carbon-based life forms, albeit a special interaction marked by active listening. The therapeutic alliance is the emergent property that stems from this interaction. Chemistry and counseling — who said the social and hard sciences were disparate? This is counseling at an elemental level.
But what about counseling not at the basic level but at a technologically advanced level? What form does that take? AI offers an answer. Machines that can think and learn, that even look and act like a human counselor, could revolutionize the field.
The future is unwritten, but the counseling community would be wise to anticipate and plan ahead. Here are some pointers for doing just that.
1) Educate yourself about emerging AI technologies. Advancements happen quickly, so staying updated on everything might be impossible, but keeping an eye out for major breakthroughs, themes and patterns is advisable.
2) As AI infiltrates society at large, be on the alert for clients who are growing aware of the technology and feeling excited or fearful of it as a result.
3) Don’t be surprised when some clients start viewing a chatbot, carebot or — potentially — therapistbot as their other counselor. Likewise, clients may anthropomorphize their robotic pets. Start thinking about how you will respond when clients speak of computers as if they are people.
4) Advocate for your profession. Tech companies are producing everything from apps to robots, and they are hiring mental health professionals to help humanize their creations. The company mentioned earlier that developed Tess is currently looking for — you guessed it — clinical psychologists. Perhaps the people at this company simply don’t realize that counselors are distinct and have a lot to offer. Advocate.
5) Be proactive and address the ethics surrounding the coming AI movement. The choice is clear: Anticipate and plan accordingly or wait, be reactive and deal with issues after they have arisen. Prevention is good medicine. At the national level, the American Counseling Association’s Ethics Committee could keep AI on its radar screen.
Predictions about world-altering technology are usually premature, but AI shows no signs of slowing down. Sooner or later, AI will bring changes — perhaps significant changes — to the counseling field. The key is to adapt and evolve. Remember, no AI is better than the best counselor … yet.
Russell Fulmer is core faculty with the Counseling@Northwestern program with The Family Institute at Northwestern University, where he specializes in the psychodynamic approach. He has written “conversations” used in AI algorithms for chatbots. Contact him at email@example.com.
Letters to the editor: firstname.lastname@example.org
Counseling Today reviews unsolicited articles written by American Counseling Association members. To access writing guidelines and tips for having an article accepted for publication, go to ct.counseling.org/feedback.
Opinions expressed and statements made in articles appearing on CT Online should not be assumed to represent the opinions of the editors or policies of the American Counseling Association.
People should worry more about not understanding themselves (like unknowingly becoming a Nazi) than about AI becoming harmful. Because we create AI, AI will never have consciousness. If humans could create consciousness for AI, it would be like producing an iPhone from a bunch of grapes; how ridiculous. We create them, so we can control them.
Well, I don’t think AI is going to harm anyone. It’s the thing you can’t avoid, so let it come. One thing is for sure: it is going to take over in the future.
While I do agree that we ought to be aware of the growing concerns AI counselors bring, I may fall into the group of “skeptics” who believes that there is a part of life that is necessary in the therapeutic environment that may not be replicated in totality by AI. Because of this, I think that AI may be better looked at as a potential addition to therapy in the future, not as the end of humanity as we know it. Perhaps, instead of taking over the entire profession, AI will become another helpful tool in the therapeutic environment.
Self-awareness is the ultimate burden. What would be precious is a switch that turns off the inner-dialogue aspect of the mind but retains the “will” aspect, so that it can be turned back on.
My relative is pursuing a Ph.D. in the field of AI, and I would like to know if there are counselors available who could give her valuable advice as to how she could take her project further. They may charge for their services. Thanking you. P. Krishna S