Police in the United Arab Emirates are investigating a case in which AI was allegedly used to clone a company director’s voice and steal $35 million in a massive heist. While it is a stunning and unusual crime, it is not the first time fraudsters have used AI-based voice spoofing to pull off a daring heist. A previous instance dates back to 2019, when criminals in the UK are said to have used deepfake software to impersonate the voice of an energy firm’s CEO and fraudulently transfer around $243,000.

While artificial intelligence is expected to open up a wave of opportunities in the coming years, the threats posed by the technology are also very real. Automation-spurred job losses are often thought to be the most pressing issue with AI, but the technology also poses serious challenges in other areas, including privacy threats posed by the rampant use of facial recognition, as well as audio and video deepfakes created by manipulating voices and likenesses. While the former tends to get attention in the mainstream media, the latter also poses a grave threat, as exhibited in these fraud cases.


As reported by Forbes, the latest instance of a manipulated voice being used for fraud happened in the UAE in early 2020, when criminals allegedly used AI to clone a company director’s voice and ask a bank manager to transfer $35 million for an acquisition. The bank manager duly made the transfers believing everything to be legitimate, only to realize later that it was an elaborate scam designed by high-tech criminals. As it turned out, the scammers had used ‘deep voice’ technology to dupe the manager and swindle the bank out of the massive sum.

The UAE Asks US Authorities For Help

According to a court document, investigators in the UAE are now seeking help from US authorities to trace $400,000 of the stolen funds that they believe is being held in US bank accounts. The rest of the money is believed to be spread across many different banks under different names in various countries around the world. According to the UAE authorities, at least seventeen people were involved in the scheme, although their names and nationalities were not immediately clear.


Talking to Forbes, Jake Moore, an expert at cyber-security firm ESET, said that audio and video ‘deepfake’ technology can be a real issue in the wrong hands, with deepfakes posing “a huge threat to data, money and businesses.” Moore added that an increasing number of businesses are likely to fall victim to similarly realistic deepfake audio scams in the future.

Source: Forbes
