Don't share a room, or your life, with the virtual psychologist Roomie! The artificial nonsense

Robot psychologists and psychotherapy with virtual “people” who have no empathy? I cannot feel at ease when my Psychology students bring up this curiosity (not unanimously, it should be said), even though it is easy to explain to them that the whole thing is a hoax dressed up with the term Artificial Intelligence (AI). That it is a chatbot spouting a string of memorized information without context. Everything labelled AI seems interesting, as if it had only just appeared and we were lucky to be living in its supreme era. But the truth is that AI has been with us for a long time, since the early days of the internet and the predictive text on our mobile phones.
What really disturbs me, however, is the thought that scientific funding can be granted to unnecessary and unethical projects like a robot psychologist called Roomie. The name of the invention alone leaves something to be desired (in English it means “roommate”: a housemate with no intimate relationship). If this project received scientific funding and ethical approval, science is lost and the foundations of Psychology are compromised. In their eagerness to evolve, they “translate” the discipline beyond what is acceptable. I use the verb “translate” deliberately, to convey the sinister and fatal effect that an entirely invented Roomie can have on the lives of people who let themselves be enchanted by the siren's fallacious song.
In fact, I now have a glimmer of hope: a little while ago I stopped the car with my giant dog in the back seat, and several children came up and insisted that he was a 'fake dog', a 'robot dog'. They did not want to touch him because of that, though they were amazed. When I calmed them down and told them they could pet him because he was a real dog, they all ran to hug him, precisely because he was not 'fake'. I hope the same happens with chatbots pretending to be therapists: people will not touch them, will not trust them, and will keep away from these ideas.
I read somewhere (it is not worth citing such a source) that robot psychologists are good allies in understanding the patient's feelings during consultations. I have just written that sentence and nothing holds together between words like 'allies', 'understand', 'feelings' and 'consultations' when they are assigned, in an approved project, to a robotic 'mind'. A mind that lies.
And speaking of words and the lack of connection between them: as a specialist in the psychology of language and related fields, I do not understand how renowned entities considered this AI project a fruitful example of advanced language models. Language is and will always be primarily human, because that is where meaning resides. We can program robots and feed them with our neural programs of language, but we cannot play a wretched Icarus when it comes to mental health care.
Now, when I read absurdities passed off as knowledge, such as a robot 'understanding feelings', I have no wish to go on and imagine what prescription this psychologist recommends to the patient or user. And by prescription I do not mean anything medicinal; for that, funding for a robot psychiatrist would still have to be approved.
Algorithms do not determine how to identify, treat and advise in the field of mental health. Algorithms contain no 'empathy molecule'. And how would one program empathic consciousness? I do not even need to feel threatened, unlike those colleagues who, working as psychologists and psychotherapists, fear AI. You cannot fear something that does not know how to take care of anyone. It got worse when I read that this Roomie intends to present itself as a specialist with ten years of experience and also to work in psychoanalysis. We will have a holocaust of the human psyche by the end of 2025 (Roomie's expected launch date). This article is not just an opinion but a piece of advice: avoid unreasonable AI fads. Do not look for roommates like this 'Roomie', because no savings are worth giving up 'real' consultations.