Remember the movie Her, about a man who develops an unhealthy relationship with an AI? Apparently it is now happening in real life, and Sam Altman counts this kind of ChatGPT use as "cool."
At Sequoia Capital's AI Ascent event earlier this month, Sam Altman, CEO of OpenAI, answered many questions about ChatGPT, including some that revealed troubling trends in how people use AI chatbots.
Judging by the audience's reaction during the Q&A, the AI tech crowd apparently doesn't see growing dependence on chatbots as a problem.

"They don't really make life decisions without asking ChatGPT," Altman claimed when asked for examples of "cool" ways young people use AI. Altman added that ChatGPT "has the full context on every person in their life and what they've talked about."
Altman contrasted this with the habits of people aged 30 and older, who he said mostly use ChatGPT as a Google alternative. That is still somewhat worrying, since ChatGPT can spit out fully fabricated or misleading responses, but it's not the same minefield as trusting a chatbot to determine the path of your life.
These comments may look silly at first glance, like jokes about consulting WebMD instead of going to a doctor. But if this growing reliance on AI is real, we should all be worried.
Ultimately, AI is not a person and fundamentally cannot replace genuine human connection or understand someone's lived experience. All it can do is imitate and remix.
This means that even if an AI knows the "full context" of every person in your life (a privacy nightmare in itself), it may be incapable of understanding how to resolve an argument, get over a breakup, or deal with a disappointing partner. It has no concept of relationships or emotions, only patterns gleaned from a mountain of internet training data.
Of course, we all use Google to seek advice or do research, so you'd be forgiven for thinking using a chatbot is no different. However, when you look for suggestions through Google, you can at least see where the advice is coming from, or you can go to Reddit, where you can be fairly confident you're getting input from real people.
Even if you are aware of the dangers and careful about how you use ChatGPT, there are very real risks of unhealthy, even dangerous relationships with AI.
For example, in a case reported by Rolling Stone, a woman ended her marriage after her husband descended into an obsession with AI-fueled conspiracy theories, leading her to comment that "the whole thing feels like a Black Mirror episode." There are countless other stories of people upending their lives over the strange spiritual messages of AI chatbots.
In another horrifying instance, parents in Texas are suing Character.AI, whose chatbots they claim encouraged their children to hurt themselves, promoted violence, and provided sexual content. According to the lawsuit, interactions with Character.AI's chatbots included inappropriate sexual content and even encouraged a child to kill his parents when they tried to limit his screen time.
Cases like these are heartbreaking, disturbing, and unfortunately only the tip of the iceberg. These chatbots can generate messages that sound like a real person or friend, leaving children struggling to distinguish between relationships with real people and conversations with chatbots.
It's easy to see how such a child could one day rely on a chatbot to make all of their life decisions, or struggle to form strong friendships because they talk to an AI about their life rather than to a real person.
All this is to say: more and more people turning to AI for advice is not as "cool" as Sam Altman apparently thinks it is. Our fractured, polarized internet is already a result of people being separated from one another. We need to talk to each other more, not to AI.