
OpenAI Worries People May Become Emotionally Reliant On Its New ChatGPT Voice Mode


OpenAI is concerned that its new human-sounding voice mode may lead people to rely on ChatGPT for companionship, possibly resulting in “dependence.”

That disclosure came in a report OpenAI released Thursday detailing its safety analysis of the tool, which began rolling out to paid users last week, and of the large language model it runs on.

ChatGPT’s enhanced voice mode sounds strikingly realistic. It responds in real time, adjusts to interruptions, and makes human-like sounds during conversations, such as giggling or saying “hmm.” It can also judge a speaker’s emotional state from their tone of voice.

Within minutes of OpenAI’s announcement of the feature at an event earlier this year, it was compared to the AI digital assistant in the 2013 film “Her,” with whom the protagonist falls in love, only to be saddened when the AI discloses “she” has connections with hundreds of other users.


Now, OpenAI appears concerned that this fictional scenario is edging toward reality, after observing users speaking with ChatGPT’s voice mode in language “expressing shared bonds” with the tool.

Eventually, “users might form social relationships with the AI, reducing their need for human interaction — potentially benefiting lonely individuals but possibly affecting healthy relationships,” according to the report. It adds that hearing information delivered by a bot that sounds human may lead users to trust the tool more than they should, given AI’s tendency to make mistakes.

The report emphasizes a major risk associated with artificial intelligence: tech companies are racing to release AI products to the public that they claim will transform the way we live, work, socialize, and discover information. But they’re doing it before anyone knows what the repercussions are. As with many technological developments, firms generally have a single vision for how their products should be used, while consumers come up with many different potential applications, often with unforeseen effects.


Some people are already creating what they call romantic relationships with AI chatbots, which has raised concerns among relationship experts.

“It’s a lot of responsibility on companies to really navigate this in an ethical and responsible way, and it’s all in the experimentation phase right now,” Liesel Sharabi, an Arizona State University professor who researches technology and human communication, told CNN in a June interview. “I do worry about people who are forming really deep connections with a technology that might not exist in the long-run and that is constantly evolving.”

According to OpenAI, human users’ interactions with ChatGPT’s voice mode may eventually affect what is deemed typical in social interactions.


“Our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the company’s report states.

For now, OpenAI says it is committed to developing AI “safely,” and it intends to continue researching the possibility of users’ “emotional reliance” on its tools.

SOURCE | CNN

Kiara Grace is a staff writer at VORNews, a reputable online publication. With a keen eye for detail and a knack for breaking down complex topics, she focuses her writing on technology trends, particularly consumer electronics and software.
