When facing mental health issues, many are turning to AI like ChatGPT for therapy. We prefer to use this Apple application

oriXone

Generative AI is getting remarkably close to feeling human, and ChatGPT is the clearest example. So close, in fact, that it can be unsettling. That is why many people suffering from mental health problems are turning to chatbots for comfort, something The Washington Post discussed in a recent article.

The article covers several real cases of people who prefer to use AI rather than see a specialist. To some extent that is understandable given their histories, but it sets a dangerous precedent. Our advice: an Apple app, plus therapy. And not because Apple is magical or better than ChatGPT.

“Paying for ChatGPT is a better deal than paying for therapy”

That line about paying for a chatbot subscription is a real testimony with which The Washington Post closes its report, and it perfectly sums up what many of the people interviewed think. They are mostly people suffering from depression, anxiety, or post-traumatic stress after the loss of a loved one.

One of the most tragic and frightening cases is that of a boy, only 14 years old, who took his own life after becoming romantically involved with a chatbot. The case led his family to sue Character AI, the company behind the chatbot the teenager fell in love with.

There are also cases in which someone loses a loved one and turns to a chatbot to recreate how that person spoke. Many advanced models can “become” a person: given the right guidance, they mimic a connection with the user on the other side and express themselves the way the loved one who is no longer there once did.


Coming back to the question of using ChatGPT and other AI as therapy, some people simply feel more comfortable telling their problems to a chatbot as if it were a psychologist. It can respond with a series of guidelines that may help, but it is in no way something to be recommended.

In fact, psychologists and other experts warn about the risk of people trusting chatbots that have not been clinically tested. They are powered by information from the internet, which can often be verified, but not always. And that is without mentioning the hallucinations that even Apple warns about.

The best advice: “Health” and professionals

Apple Health

If we bring all this to the Apple ecosystem, there is one application that can be vital for knowing how we are doing physically (if we have an Apple Watch) and also mentally. That application is, precisely, “Health”.

It is Apple’s native app which, if you are not familiar with it, gathers plenty of advice from professionals to improve our health in general. In addition, since iOS 17 it includes a mental health log on the iPhone where we can quickly and easily record how we are feeling, and it offers guidance based on those entries.
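For developers, Apple also exposes this kind of mood logging through HealthKit’s State of Mind API (available from iOS 18). As a rough sketch of what recording such an entry looks like in Swift — the valence value, labels, and associations below are illustrative assumptions, not the Health app’s own logic:

```swift
import HealthKit

// Minimal sketch: saving a "State of Mind" entry via HealthKit.
// Requires iOS 18+ and the HealthKit entitlement; the mood values
// used here are placeholders for illustration.
@available(iOS 18.0, *)
func logDailyMood() async throws {
    let healthStore = HKHealthStore()
    let stateOfMindType = HKSampleType.stateOfMindType()

    // Ask the user for permission to write State of Mind samples.
    try await healthStore.requestAuthorization(toShare: [stateOfMindType], read: [])

    // Valence ranges from -1 (very unpleasant) to 1 (very pleasant).
    let sample = HKStateOfMind(
        date: Date(),
        kind: .dailyMood,
        valence: 0.4,
        labels: [.calm, .grateful],
        associations: [.health]
    )

    try await healthStore.save(sample)
}
```

Entries saved this way show up alongside the ones logged manually in the Health app, which is what lets third-party wellbeing apps feed the same mental health record the article refers to.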


Of course, even though this application was built with input from experts in the field, in the end it is only a small aid. The best way to deal with mental health problems is to see a specialist. In fact, experts advise going to therapy even when we seemingly feel fine, because no one is ever in perfect mental health.

In Applesfera | Sleep apnea on Apple Watch: what it is, how it is detected and how to treat it

In Applesfera | How to take an electrocardiogram with your Apple Watch
