
A recent Wall Street Journal report has highlighted how, in March this year, a group of hackers were able to use AI software to mimic an energy company CEO’s voice in order to steal £201,000.

What Happened?

Reports indicate that the CEO of an unnamed UK-based energy company received a phone call from someone he believed to be the German chief executive of the parent company. The caller ordered the CEO of the UK-based energy company to immediately transfer €220,000 (£201,000) into the bank account of a Hungarian supplier.

The voice was reported to have been so accurate that the CEO of the energy company even recognised what he thought were the subtleties of his boss’s German accent, and even the “melody” of his speech.

The call was so convincing that the energy company made the transfer of funds as requested.

Fraudster Using AI Software

The caller, who was later discovered to have been a fraudster using AI-based voice-altering software to simulate the voice of the German boss, called three times. In the first call, the fraudster requested the transfer; in the second call, they (falsely) claimed that the transfer had been reimbursed; and in the third call, the fraudster requested an additional payment. It was this third call that aroused suspicion, partly because the telephone number appeared to indicate that the caller was in Austria rather than Hungary.

Money To Hungary, Mexico and Beyond

Unfortunately, the money had already been transferred to a Hungarian account after the first call, and it has since been discovered that the money was immediately moved from the alleged supplier’s Hungarian bank account to an account in Mexico, and then dispersed to accounts in other locations, thereby making it very difficult for authorities to follow the trail.

What Sort of Software?

The kind of software used in this attack may have been similar in its output to that demonstrated by researchers from Dessa, an AI company based in Toronto. Dessa has produced a video demonstrating how this kind of software can produce a relatively accurate simulation of the voice of popular podcaster and comedian Joe Rogan – see: https://www.youtube.com/watch?time_continue=1&v=DWK_iYBl8cA

What Does This Mean For Your Business?

It is known that cybercriminals, deterred by improved and more robust enterprise security practices, have decided to target human error and concentrate more on social engineering attacks, a category that this voice simulation attack (via phone calls) fits into. The fact that this attack has taken place and succeeded shows that some cybercriminals already have access to the computing power and the most up-to-date machine-learning AI technology, and are clearly capable of using them.

This means that companies and organisations (particularly larger ones) may now be at risk of facing more sophisticated deception and phishing attacks. The AI company Dessa has suggested that organisations and even individuals could expect to face future threats such as spam callers impersonating relatives or spouses to obtain personal information, impersonations intended to bully or harass, persons trying to gain entrance to high-security-clearance areas by impersonating government officials, and even an ‘audio deepfake’ of a politician being used to manipulate election results or cause a social uprising.

Companies should try to guard against social engineering attacks by educating all staff about the risks and by having clear verification procedures (not just relying on phone calls), tests, and chain-of-command authorisation in place for any requests for funds.
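As an illustration only (all names, roles, and thresholds below are hypothetical, not a prescribed standard), such a policy could be sketched as a simple check that refuses any large transfer unless an independent call-back on a known number and a second authorisation have both taken place:

```python
# Hypothetical sketch of a funds-transfer verification policy:
# large payments need an out-of-band call-back AND dual authorisation.

from dataclasses import dataclass, field

@dataclass
class TransferRequest:
    requester: str                     # who asked for the transfer
    amount_eur: float
    callback_verified: bool = False    # confirmed via a known, trusted number
    approvals: set = field(default_factory=set)  # second-person sign-offs

def approve_transfer(req: TransferRequest, threshold_eur: float = 10_000) -> bool:
    """Approve only if out-of-band verification and dual authorisation hold."""
    if req.amount_eur < threshold_eur:
        return True  # small payments may follow the normal process
    # Never act on an inbound call alone: require an independent call-back
    # plus at least one additional authoriser from the chain of command.
    return req.callback_verified and len(req.approvals) >= 1

# A request like the one in this story would be refused at first...
req = TransferRequest(requester="CEO", amount_eur=220_000)
print(approve_transfer(req))   # False: no call-back, no second approval

# ...and only approved once both independent checks have happened.
req.callback_verified = True
req.approvals.add("Finance Director")
print(approve_transfer(req))   # True
```

The point of the sketch is that no single phone call, however convincing the voice, can move money on its own: approval depends on checks that a voice-cloning fraudster cannot complete.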