Artificial Intelligence Therapy
UNDERSTANDING AI SUPPORT TOOLS FOR EMOTIONAL WELLBEING
You may have noticed that AI (Artificial Intelligence) is appearing in many more areas of our daily lives: helping us write emails and documents, organise our calendars, or answer our online questions. More recently, some AI systems have been designed to offer support with emotional wellbeing. They allow people to talk about stress, uncertainty, loneliness, or difficult feelings in private, at any time. This development has led many to wonder: Can a computer truly provide support that feels meaningful? Is it helpful? Is it safe?
This article doesn't encourage or discourage the use of such support. Instead, it offers an unbiased overview, so each person can consider whether these tools might have a place in their own wellbeing, or whether they would prefer to rely on traditional human support alone.
AI support tools are apps or websites that respond to written or spoken messages in a way designed to feel supportive and reflective. They are not counsellors or therapists, but they draw on ideas used in wellbeing care, such as noticing emotions, exploring thoughts, and practising calming or grounding techniques.
They can be opened at any time, which some people find helpful when worries surface late at night or when talking to someone in person is not possible.
People who use these tools sometimes describe:
A space to express feelings without needing to explain everything at once.
Small helpful techniques for calming the body or steadying the breath.
A way of putting thoughts into words at a moment when emotions feel crowded.
A sense of company when the day feels quiet or isolating.
For many, the value of such apps lies not in being given “answers,” but in helping them pause long enough to understand what is happening inside.
It's important to understand that these tools cannot:
Provide diagnosis or clinical treatment.
Replace therapy from a trained professional.
Understand personal history or context the way another person might.
Recognise risk unless it is directly stated.
Because of this, they work best as an additional support, not the only means of support.
As with most things, different tools produce different results, and this is true of the most popular AI therapy tools available. Each therapeutic tool takes a different approach to the support it offers and the tone of the suggestions it gives. The experience of using them varies from person to person, and it is common for someone to try one and find it does not suit them.
Here is a brief summary of the most popular therapeutic tools available, so you can see what is being offered and make an informed decision. It is strongly recommended that you do not choose a tool while you are experiencing high levels of stress or anxiety. The choice should be made calmly, with a clear mind and, preferably, with the help of a peer or friend who is aware of your situation.
Wysa
Calm, steady style.
Offers breathing and grounding exercises.
Works well with screen readers.
Free version available.
Ash (Talk to Ash)
Conversational tone.
Supports both typing and speaking.
Free while in early development.
Youper
Focuses on mood check-ins.
Encourages noticing of patterns over time.
Free basic features; optional paid extras.
Yuna
Voice-based conversational support.
Useful for those who prefer speaking to typing.
Short free trial, then subscription.
Aitherapy
Designed for reflective emotional conversation.
Uses gentle questioning to explore thoughts and feelings.
Experiences vary as the tool continues to develop.
Free at the time of writing.
These tools are simply options. Whether one feels helpful depends on the individual. It can be worth trying each service, even asking the same question of each, so a true comparison can be made. You may find that one tool responds more appropriately than another, depending on the issue you raise. For this reason, it is best to carry out any comparison while you are feeling settled, or with the help of a reliable friend or peer.
For many, the value of AI therapy comes from short, occasional use rather than long conversations. A typical experience might look like:
Opening the app during a quiet moment.
Saying something simple, such as: “I feel unsettled today.”
Receiving a gentle question or reflection in return.
Pausing to consider what comes up.
Closing the app and continuing with the day.
This rhythm can help prevent the tool from becoming the main place where feelings are shared, and stops the answers it provides from becoming something you rely on.
Because these tools are not human, their responses can sometimes feel general. When something is suggested:
You might ask yourself: Does this fit my situation?
If unsure, you could talk it over with someone you trust.
If it involves a change in routine, you might try it lightly and observe how you feel.
There's no need to act on every suggestion presented. Consider each one and decide whether you feel comfortable attempting what is offered. If you have ANY doubts, seek a second opinion from a trusted friend or peer. Once you are happy with a suggestion, try it for a short period, say a couple of days, and reflect on whether it has helped.
You should always keep a sense of balance with any support you decide to use. It may help to:
Use these tools as one support among others.
Keep conversations with the AI tool short and purposeful.
Take breaks from the app to let thoughts settle naturally.
Continue speaking with people, especially during difficult periods.
Human contact—through conversation, shared presence, or simple companionship—remains central to emotional wellbeing. This is because, however developed the AI model is, it cannot offer the genuinely human side of therapy, as it is not present in the living world.
A trained professional, support worker, or crisis service should be contacted without delay if:
Feelings become overwhelming.
There are thoughts of harming oneself.
Daily functioning becomes difficult for several days.
Always be aware that AI tools CANNOT provide emergency help. Current legislation prevents the programs running these tools from making decisions on such matters. Even in the real world, if emergency assistance is required, it takes a team of specialists to make that decision.
AI support and therapy tools are a new development in emotional wellbeing. Some people find them helpful in moments when they need a quiet place to speak or reflect. Others find they prefer to wait and speak to a person. Both responses are understandable.
What matters most is not whether someone uses an app, but that they have sources of support—human or otherwise—that help them feel steady, seen, and not alone.
Editor's Notes
As a qualified counsellor, I was interested in the validity of these new tools, so I tested each one and compared its responses with the feedback I would give in a real-life situation.
I was surprised by the level of sincerity these tools returned when asked emotional questions. As the article highlights, they may be of help to those needing to calm stress or anxiety in a specific situation.
I cannot say whether these AI therapy apps will be of use to you; that is a decision for the individual. My own experience of using them means I would try them if I needed an exercise or method to help achieve a sense of calm. However, being trained and qualified in traditional face-to-face CBT therapy, I would continue to use that method rather than rely on AI to provide suggestions.