
OpenAI warns of the dangers of its own voice tool

by admin

Developers at AI giant OpenAI know that their new voice tool poses dangers and have suggestions for making the technology more secure. Jonathan Raa/Getty Images

OpenAI is aware of the dangers associated with AI-generated voice tools.

The AI company revealed details about its Voice Engine tool, which has not yet fully launched.

The company cited its security measures, and it would not confirm whether the model will even be widely released.

This is a machine translation of an article from our US colleagues at Business Insider. It was automatically translated and checked by a real editor.

Voice tools developed using artificial intelligence (AI) are a tricky matter, especially given the threat of fake news in a US election year. OpenAI knows this too. In a blog post, the company has now presented the first results from the test phase of its new synthetic-voice tool. At the same time, the AI pioneer addressed concerns about using AI to replicate human voices.

According to the company, OpenAI developed its Voice Engine tool in late 2022. It requires no more than a 15-second audio clip of a real person’s voice to create an eerily realistic, human-sounding replica of that voice. And users can have this voice say anything – even in other languages.


The tool is not yet available to the public. Additionally, OpenAI says it is not yet clear whether and how this technology can be used on a large scale. “We recognize that generating speech that resembles people’s voices poses serious risks that are particularly important in an election year,” OpenAI writes in its blog post. “We are working with U.S. and international partners in government, media, entertainment, education, civil society and beyond to incorporate their feedback into our development.”


OpenAI currently uses the tool for ChatGPT’s reading capabilities as well as the company’s text-to-speech API.

Late last year, OpenAI began making the tool available externally, working with “a small group of trusted partners.” With these partners, the company is testing Voice Engine for children’s educational materials, language translation, and medical speech restoration, it said in its post.

Risky technology, but opportunity for dialogue about responsible use?

OpenAI emphasized that its partner organizations must adhere to strict guidelines to use Voice Engine. For example, they must obtain the consent of each person whose voice is replicated. In addition, listeners must be informed that the voice was generated by AI.

“We are taking a cautious and informed approach to wider release due to the potential for misuse of synthetic voice,” the company wrote. “We hope to open a dialogue about the responsible use of synthetic voices and how society adapts to these new possibilities.”


The company says it is not yet releasing the tool to the general public. But it urged policymakers and developers to take action to prevent dangerous misuse of the technology it has developed.

For example, OpenAI suggested creating a “no-go voice list” to prevent the unauthorized replication of prominent voices, such as politicians or celebrities.

The company also recommended that banks stop using voice-based security authentication and that researchers develop techniques to determine whether a voice is real or fake.

Read the original article at Business Insider.
