Over the weekend, police in Hong Kong described a fraud carried out in January against a finance employee at a large international company.
He was reportedly contacted by someone posing as the company's CFO, who said he wanted to discuss some sensitive transactions, according to the Hong Kong public broadcaster RTHK.
“Deepfake” technology used in the attack
The finance employee was at first puzzled by the request. But when he later discussed the matter in a video conference with what he believed were several of his own colleagues, he changed his mind.
The fraudsters used “deepfake” technology in the attack, according to Hong Kong police. With it, they faked both the video and the voices of his colleagues, and tricked the employee.
He ultimately made 15 transfers totaling around NOK 272 million at today’s exchange rate.
– Makes the attacks very credible
Økokrim is now asking both people and businesses in Norway to be vigilant.
– Criminals have started using “deepfake” technology, which makes the attacks very credible, says Lone Charlotte Pettersen, department director for prevention and intelligence at Økokrim.
– It is expected that criminals will use the technology in several forms of fraud in the future.
Økokrim has registered that fraud attempts are becoming increasingly targeted and sophisticated. The number of digital frauds has risen significantly in recent years, and Økokrim now considers it a societal problem.
What is a “deepfake”?
A deepfake is an image, audio clip or video that has been manipulated using artificial intelligence. Such content can make it appear that someone has said something they never said, or done something they never did. The technology can be used for harmless fun, but it can also be misused, for example to create fake pornographic images of celebrities or to make it look as if a head of state is making serious statements.
Expected to be used in CEO fraud
In its threat assessment, Økokrim highlights that “deepfakes” are very likely to be used to an increasing extent in CEO fraud.
– There is no reason to believe that we are less exposed to such fraud here than in Hong Kong, says Pettersen.
NRK has previously reported that several researchers believe 2024 will be the year of the “deepfake”, after fake nude images of Taylor Swift spread widely in January.
The issue of “deepfakes” will be the topic of Urix on Wednesday at 9.20pm on NRK TV or NRK1.