Dr Tara Wyne, Clinical Psychologist Image Credit: Supplied

Dubai: With technology integrating itself into people’s lives with increasing finesse by the day, there are very few domains left untouched by it. The trend of individuals seeking counselling or therapy for a range of emotional and psychological stresses through artificial intelligence (AI)-driven chatbots is catching on, too, and psychology experts can see both sides of the counselling coin, so to speak.

Gulf News speaks with Dr Tara Wyne, Clinical Psychologist and the Clinical Director at The Lighthouse in Dubai, to shed light on how human emotions require a varied approach that perhaps may not always be the forte of technology.

Can technology such as chatbots come to the aid of humans in the field of counselling?

It’s true that a group of psychologists from Stanford has developed a chatbot called Woebot that can engage with clients and track their mood, have chat conversations on [their] current state and experience and also use word games to help people in distress. AI will influence every area of our life and psychotherapy is no exception. It’s still very early in the research phase, so we have to be cautious in our predictions as to the impact it can have in the field of mental health.

I wouldn’t rule out the role chatbots can play. The founding fathers of psychology and psychotherapy may never have predicted that therapy could be conducted indirectly over Skype or even envisaged telecounselling; however, these mediums for therapy are now widely accepted and therapeutically effective.

Chatbots may be able to provide support services in rural populations or even for underprivileged and low SES [socioeconomic status] populations, providing much-needed connection and support for people who have no other access to services or support.

However, there is no true comparison in quality and depth between a human therapist and a chatbot.

A real person provides validation, acceptance, empathy, compassion, containment [and] support.

[People] can analyse and provide feedback, challenge and use their own experiences to help support the therapeutic goals. They can judge when the client is too vulnerable to talk and offer ways to make them feel safe and secure. Human therapists will always be more effective because they choose to feel, care and support; they aren’t following a programme.

They are moved by the client and believe the client; their genuine reactions and responses to the client provide corrective emotional experiences and facilitate healing.

A human therapist can also assess risk and seek help appropriately for the client, which a chatbot may miss or not respond to due to the nuances and subtlety of communication and language.

A chatbot can do some temporary holding, responding and support but [offer] nothing of true depth. I do not believe they can heal wounds or trauma, which a human therapist is trained to do and is confident and competent [enough] to do.

Why is this trend attractive to individuals?

Conventional therapy is often inaccessible to many in the population: either the hoops to jump through to access government-funded support are too great, or it’s unaffordable.

Many people also cannot commit to regular consultations due to their commitments and schedules. Chatbots definitely overcome and solve [such] issues. They are readily and constantly available, don’t have the time boundaries of a conventional therapist, and can certainly be lower in cost.

People also report being able to open up and disclose more because they don’t fear judgement [from] the inanimate chatbot.

Why would people want impersonal advice from chatbots?

Human beings crave connection and being heard. Having some kind of relationship and feeling [of being] cared for is second in line to breathing, according to Harvard psychiatrist and researcher George Vaillant.

If people cannot access conventional psychological services ... if they struggle with social anxiety and find relationships stressful, they may find great ease with a chatbot which overcomes these difficulties.

What are the advantages and disadvantages of seeking counselling through such devices?

Advantages

Convenience

A chatbot can be accessed any time, any place, and perhaps outside normal working hours, when real-life crises occur.

Accessibility

It’s cheaper and lets people from many backgrounds and cultures access therapy. It can be provided in multiple languages, which caters better to the global citizen and prevents minorities from being denied access to mental health services.

People in war zones and natural disasters may never find therapists due to the dangers and instability; chatbots don’t have these issues and can be placed in any setting.

Anonymity

People may not want to walk into mental health clinics or settings due to fear of discrimination and the stigma attached to seeking psychological or psychiatric support. Chatbots can be placed in innocuous settings to avoid this.

Also, the chatbot not being a real person removes the fear of judgement and the time normally needed to build an interpersonal alliance before support can be provided.

People seem to reveal more, and faster, when they aren’t afraid of being judged. Therefore, the therapeutic benefit may be accessed sooner.

Disadvantages

Chatbots cannot truly assess or respond to risk or vulnerability in a client, and may place clients at higher risk of self-harm or suicide by opening up conversation and asking in-depth questions while not picking up on tone or body language.

They cannot go outside their programming; they can’t pick and choose from a wide range of therapeutic models to give the client the very best approach. This will be a very binary kind of therapy, without sophistication.

Chatbots can’t analyse, nor can they challenge or be directive. At times the client needs more than just CBT [cognitive behavioural therapy], psychoeducation and check-ins.

Chatbots can’t share personal experience to help normalise the experience or suffering of clients; they can only provide standard two-dimensional advice.

Chatbots cannot attune to a client. At the heart of any relationship is attunement; great therapy involves mirroring the client’s emotion, facial expressions, body language, words and tone, making them feel truly and deeply ‘felt, understood and accepted’.

Chatbots can’t provide corrective emotional experiences. If harm was done to a client through their relationships, the chatbot won’t be a legitimate source of healing; it can only do some limited level of symptom management.

Chatbots cannot respond to a crisis. They aren’t designed to be able to navigate a client’s very specific problem and offer support and solutions. They can’t contain and hold the client emotionally.