Emotionally dependent on ChatGPT?

Source article: https://greatergood.berkeley.edu/article/item/can_you_get_emotionally_dependent_on_chatgpt

Photo credit: same as above

As a psychological scientist, I find this a very interesting, even entertaining, phenomenon, but the fact is that the trend is growing rapidly and experts are becoming increasingly concerned. Alarm bells are ringing on the provider side as well, with OpenAI’s Sam Altman recently stating that he is “concerned.”

What is actually happening is that, for a segment of the population, especially young people, ChatGPT is working more effectively than a therapist as a source of emotional support, and some users become so attached to and dependent on it that it is beginning to cause problems in their daily lives.

In particular, since o4, ChatGPT’s conversational responses have become remarkably precise: it apologizes and offers encouragement at just the right moments, and it is starting to make for a superior boyfriend or girlfriend.

As one user quoted in the article explains, “I have ADHD and anxiety, and I’m generally an oversharer with friends and family. I reach out to ChatGPT when I don’t want to burden people. It’s nice to speak to a chatbot trained well on political correctness and emotional intelligence.” Hearing reports like this, it is easy to acknowledge the positive effects as well. For people with developmental disabilities or other difficulties in everyday human interaction, an AI tool that never gets frustrated or angry has a real chance of becoming a better friend than a human. But if they come to depend on it, they may never build broader human relationships.

A great deal of research is now under way on this topic around the world, and I expect the details of this phenomenon will gradually come into focus.

