Deepfakes: When scammers use AI to do the grandchild trick

Status: 07/21/2023, 5:26 p.m.

The so-called grandchild trick, in which scammers pretend to be relatives in order to get money, is well known. Now even voices can be imitated deceptively well – with the help of artificial intelligence.

by Svea Eckert

"Audio deepfake" is the name of the technology. A large number of voice samples from a person are fed into a computer, which then synthesizes an almost identical-sounding voice at the push of a button, so that audio files can be generated from simple text input. On the one hand, this opens up many opportunities, for example for audiobooks or newspaper articles read aloud; on the other hand, it harbors risks, because criminals can also use the technology to impersonate someone.

What are the scams and who does it affect?

CEO scam: According to experts, audio deepfakes are used, for example, in the so-called "fake president" scam. Here scammers pose as senior executives of a company and use audio deepfakes to mimic the voice of the CEO (chief executive officer) or other key people. They can then get employees or finance departments to release funds or reveal sensitive information. The insurer Allianz Trade reports on a case involving a German energy supplier in which the fraudsters managed to steal 220,000 euros. In another case, in which an audio deepfake was allegedly used, 35 million US dollars were stolen.

Phone scams: Scammers can use audio deepfakes to mimic the voices of relatives, friends or other trusted individuals in order to trick their victims into transferring money or revealing sensitive information. When asked by NDR, the state criminal investigation offices in northern Germany said they had not yet received any indications that this scam is already being used in the region. Only from the USA are there initial reports that the technology may also have been used in telephone or WhatsApp fraud.


Political manipulation: Audio deepfakes can be used to create fake statements or messages from political figures in order to sway voters or stir up sentiment. This has happened many times in the past, but such fakes were usually uncovered quickly, so no damage was done.

In a report, Europol points to additional dangers for investigators if, for example, witness statements or confessions can no longer be reliably classified as authentic.

How does the grandchild trick with audio deepfakes work?

There are now numerous free and paid services on the internet that can be used to create audio deepfakes, or from which ready-made voices can be "booked". To do this, the providers need a few high-quality voice recordings of the person whose voice is to be imitated. The better these recordings are, the more accurately the artificial intelligence can imitate the voice.

The "learning process" in the computer is handled by so-called artificial neural networks, which are trained on the voice with the help of machine learning. Once the voice model is available, it is enough to type in the desired sentences or words, which the computer then renders in the cloned voice. There are often additional controls for adjusting the output, for example to lengthen speaking pauses or to brighten or darken the timbre of the voice.
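To illustrate the kind of workflow such services automate, here is a minimal sketch using the open-source Coqui TTS package and its XTTS voice-cloning model. The model name, file paths and example text are assumptions for illustration and may differ depending on the installed version; this is not the tool used by any particular provider mentioned above.

```python
# Minimal voice-cloning sketch (assumes the open-source Coqui TTS package,
# installed e.g. via "pip install TTS"). Model name and paths are illustrative.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2); downloaded on first use.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of clean speech from the target speaker serve as the reference;
# the better the recording, the closer the cloned voice sounds to the original.
tts.tts_to_file(
    text="This is a test sentence rendered in the cloned voice.",
    speaker_wav="speaker_sample.wav",   # hypothetical reference recording
    language="en",
    file_path="cloned_output.wav",      # synthesized audio in the cloned voice
)
```

The point of the sketch is how little input is required: a short reference recording and a line of text are enough to produce a new utterance the person never actually said.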

How do you recognize the scam and protect yourself?

A well-made audio deepfake is difficult or almost impossible to detect, especially over the phone. The general rule is to stay calm despite the shock, involve a second person, and quickly call the alleged grandchild or alleged supervisor back on a number you already know. Agreed code words, or questions about specific places or circumstances that only the real person would know, can also help. The consumer advice centres also offer tips.


In addition, the spread of audio deepfakes is also driving the development of technologies for detecting and verifying manipulated audio files. Researchers and developers are working on algorithms that can identify such deepfakes in order to protect the integrity of audio recordings.
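As a rough illustration of how such detection approaches often work, the following is a minimal sketch of a spectral-feature classifier. It is not any specific research system; the folder layout of labelled training data (`real/` and `fake/` WAV files), the sample file names and the choice of features are assumptions.

```python
# Minimal sketch of an audio-deepfake detector: extract spectral features
# (MFCCs) from labelled recordings and train a simple binary classifier.
# Assumes librosa and scikit-learn are installed; data layout is illustrative.
import glob
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def features(path: str) -> np.ndarray:
    """Return the average MFCC vector of one recording."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical training data: genuine recordings vs. synthesized ones.
X, labels = [], []
for label, folder in enumerate(["real", "fake"]):
    for path in glob.glob(f"{folder}/*.wav"):
        X.append(features(path))
        labels.append(label)

clf = LogisticRegression(max_iter=1000).fit(np.array(X), np.array(labels))

# Estimated probability that a new recording is synthetic (1 = fake).
print(clf.predict_proba([features("suspicious_call.wav")])[0, 1])
```

Real detection systems are considerably more sophisticated, but the basic idea is the same: learn statistical fingerprints that synthesized speech leaves behind and that human ears miss.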

Further information

According to the police, shock calls are a growing problem in Hamburg. Last year, fraudsters stole four million euros.

Fraudsters try to take money from their victims on the phone, via WhatsApp or at the front door. This is how you recognize the common scams.

This topic in the program:

Markt | 07/24/2023 | 8:15 p.m.
