A gang stole $35 million from an Emirati bank with the help of new technology

Money transfer


A bank manager in the United Arab Emirates fell victim to an elaborate trick by thieves who stole $35 million through an audio-based fraud: the manager received what he believed was voice confirmation from the account holder to transfer the required amount.

Early last year, an unnamed bank manager in the United Arab Emirates received a call from a well-known customer; he had spoken to him before and recognized his voice. The man explained excitedly that his company was about to make a significant acquisition and needed the bank to move $35 million as quickly as possible, according to Audi Central.

During the call, the customer said he had hired an attorney named Martin Zellner to handle the acquisition, and that the bank manager could verify this against the emails from the attorney already in his inbox. At that point, the bank manager made the most costly mistake of his career.

Voice reproduction by artificial intelligence

The bank manager authorized the transfer, believing everything was legitimate, but in reality he had fallen victim to an elaborate hi-tech scam: the person who spoke to him on the phone used "deep voice" technology, which relies on artificial intelligence to reproduce a voice so faithfully that it is nearly impossible for anyone to distinguish it from the original.

Further details about the case have not been disclosed, and although it happened more than a year ago, the investigation is still ongoing. At the end of last week, the Emirati authorities offered a first glimpse into the course of the investigation: the released documents state that at least 17 people are believed to have been involved in the elaborate scheme, and that the money was sent to accounts around the world, making it difficult to trace and recover.

Cybercrime experts warn that this phenomenon is only the beginning, given the remarkable speed at which voice reproduction technologies are developing, noting that criminals are already profiting from them.

Experts warn of new voice reproduction techniques

Cybersecurity expert Jake Moore said that manipulating audio, which is easier to orchestrate than creating deepfake videos, will soon become more prevalent, and that without education and awareness of this new type of attack, more companies are likely to fall victim to the danger.

He continued: "We are currently on the verge of malicious attacks that use the latest technology to manipulate people who are innocently unaware of deepfakes and even of their existence. Obtaining a sample of someone's voice is no longer difficult: it can be taken from a video on YouTube or Facebook, or from a simple phone call, and all of these can be used for malicious purposes to clone the audio."
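To illustrate how low the barrier has become, the snippet below is a minimal sketch of voice cloning using the open-source Coqui TTS library and its XTTS v2 model; this is our own illustrative assumption, as the article does not say which tools the fraudsters used, and the file names and spoken text are hypothetical placeholders. A few seconds of reference audio is typically enough for such a model to imitate a speaker, which is precisely why experts urge better awareness and verification procedures.

```python
# Minimal voice-cloning sketch (assumption: the open-source Coqui TTS library, `pip install TTS`).
# File names and the spoken sentence are hypothetical placeholders for illustration only.
from TTS.api import TTS

# Load the pretrained multilingual voice-cloning model (weights are downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in the voice captured in a short reference recording.
tts.tts_to_file(
    text="This is a demonstration of automated voice cloning.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target speaker's voice
    language="en",
    file_path="cloned_voice.wav",
)
```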
