
The Ministry of Internal Affairs warned about fraud schemes using neural networks

- This information is not reliable. The audio in the video was generated artificially, and the footage was taken from another publication previously released by the Ministry of Internal Affairs for the Kursk Region.

We ask citizens to be attentive and to verify information they see on social networks. Official statements by employees of the Ministry of Internal Affairs of Russia for the Kursk Region are published only on the department's official website, the regional Main Directorate of Internal Affairs told citizens.

This type of fraud, which involves forging the image and voice of an unsuspecting person, is only beginning to gain momentum in our country, and its consequences are very dangerous. The quality of images and voices generated by artificial intelligence has reached a level where not only ordinary citizens but even experts, who have already sounded the alarm, cannot always tell them from the real thing.

An example of how easy it is to deceive people with the help of artificial intelligence is a crime committed not long ago in Altai. There, scammers forged audio and video recordings of a Barnaul resident and in this way managed to steal a substantial sum from her friends.

Dozens of contacts of this woman, Marina Vasilenko, received fake messages from the scammers. Friends and acquaintances heard her voice in the messages, but she herself had sent nothing to anyone.

The first five people from her contact list responded to the request immediately and sent more than 100,000 rubles to an unfamiliar card.

The scheme worked as follows. The fraudsters hacked Marina's Telegram account and found video and audio messages in her chats. With the help of artificial intelligence they produced new ones containing the text they needed. Longtime friends and relatives suspected nothing: they not only heard a familiar voice but also received a video message from the girl.

The victim has already filed a report with the police. In the Altai Territory this is the first recorded case of fraud using neural networks, but the police are confident it will not be the last.

According to experts, it was possible to fake another person's voice before, as well as to create a deepfake, a video with another person's face, but a few years ago this took several days. Now such a "talking" image can be created in less than a minute.

Police warn citizens: if you receive a video in which a familiar voice urgently asks to borrow money, do not rush and do not "believe your eyes." Law enforcement officials advise calling the person asking for help back before taking any action.

Experts at the Ministry of Internal Affairs have calculated that this year alone, residents of the Altai Territory have already lost more than 1 billion rubles to fraudsters operating in the field of information technology. How many such cases there are across the country, no one has yet counted, but it is already clear that the figures will be very large.

Experts say that faking someone's voice, or a video with their face, used to take several days. Now it takes less than a minute.

Here is what the new scheme targeting Telegram users looks like today.

Scammers gain access to an account and then start messaging potential victims from the owner's contact list, asking them to transfer money. The key demand is to transfer it as quickly as possible.

For added credibility, the scammers attach a voice message from the account owner, assembled from excerpts of earlier voice messages. This audio message is sent to all chats in which the account owner participates.

This scheme is still new for our country but has already been "tested" abroad. A reliable way to counter it has not yet been found.


Source: Российская Газета: издание Правительства РФ
