
Kaspersky: deepfake voice AI can deceive users

by admin

Artificial intelligence can also be used to reconstruct voices and songs; however, voice deepfakes can also be generated to deceive users and consumers.

The Beatles have once again thrilled millions of fans around the world by releasing a new song created with the help of Artificial Intelligence (AI), which combines parts of an old recording and improves its audio quality. While there is great excitement about the band’s new work, there is also a darker side to the use of AI: creating fake voices and images.

Fortunately, at the moment, these deepfakes and the tools used to make them are not yet well developed or widespread. However, their potential for use in fraud is extremely high, and the technology continues to evolve.

What are voice deepfakes capable of?

Recently, OpenAI presented an Audio API model capable of generating human-like speech and voice messages. So far, this OpenAI software comes closest to real human speech.

In the future, these models could become a new tool in the hands of cybercriminals. The Audio API can read the requested text aloud, letting users choose from a set of suggested voice options. Currently, the OpenAI model cannot be used to create voice deepfakes, but it is indicative of the rapid development of voice generation technologies.

Today there is no tool capable of producing a high-quality deepfake voice that is indistinguishable from real human speech. However, in recent months more and more tools for generating human voices have been released. Previously, users needed basic programming skills, but these tools are becoming easier to work with. In the near future, we can expect to see models that combine ease of use with quality of results.


Frauds that exploit artificial intelligence are not yet frequent, but there are already examples of “successful” cases. In mid-October 2023, the American venture capitalist Tim Draper warned his Twitter followers that scammers might use his voice for fraud. Tim explained that requests for money made in his voice were the result of artificial intelligence, which is clearly becoming more and more sophisticated.

How to protect yourself?

So far, society does not perceive voice deepfakes as a significant cyber threat. There are very few cases in which they have been used with malicious intent, so protection technologies have been slow to emerge.

For the moment, the best way to protect yourself is to listen carefully to what the person on the phone is saying. If the recording quality is low, contains noise, or the voice sounds robotic, you should not trust the information you hear.

Another way to test your interlocutor’s “humanity” is to ask unusual questions. For example, if the interlocutor were a speech model, a question about their favorite color would leave it stumped, since this is not something a fraud victim usually asks. Even if the attacker manually dials the number and plays back a pre-recorded answer, the delay in the response will make it clear that you are being deceived.

Another option is to install a reliable, comprehensive security solution. While it cannot detect voice deepfakes with 100% accuracy, it can help users avoid suspicious websites, payments, and malware downloads by protecting browsers and checking all files on the computer.

Dmitry Anikin, Senior Data Scientist at Kaspersky, commented:
“The main advice at the moment is not to exaggerate the threat or try to recognize voice deepfakes where they don’t exist. For now, current technology is unlikely to be powerful enough to create a voice that a human could not recognize as artificial. However, you need to be aware of the possible threats and be prepared for advanced deepfake fraud to become a reality in the near future.”
