How to protect yourself from the danger of Deepfake: How much money a pensioner from Vaslui and a director from Great Britain lost – DNSC Guide


Using Deepfake technology, attackers can perfectly imitate a loved one’s voice to trick you and steal your money, the National Cyber Security Directorate (DNSC) warned on Monday, releasing a guide on how to identify and defend against the phenomenon. This year, a pensioner from Vaslui lost 52,000 lei, and in 2019 a Deepfake audio recording was used to defraud the general manager of a UK company of 220,000 euros.

Photo: Shutterstock

The danger of Deepfake: The technology can be used in fraud or to manipulate public opinion

Deepfake technology uses advanced Artificial Intelligence techniques to create fake images or audio-video content that can be used in fraud attempts, such as spoofing your identity in phone or video calls to gain unauthorized access to information or financial resources.

  • “One example: There are multiple cases in which a video shows a public figure giving financial advice. In this video, the person strongly recommends investing in a certain company or business, promising high returns and minimal risks.
  • Details matter: The video is professionally made, with high image and sound quality. The public figure appears sincere and persuasive, using accessible language and convincing arguments.
  • The purpose of the campaign: Such a Deepfake could be used to manipulate the population into investing in an illicit business (a scam) owned or controlled by the manipulators.
  • Impact: The video is quickly shared on social media, with a huge potential to be viewed by millions of people. Many of those who see the video may be persuaded to invest in the recommended business, at the risk of subsequently losing significant amounts of money,” warns DNSC.

Deepfake technology can be used to create false records implicating people in activities they did not perform or to produce false evidence in legal contexts.

In politics and other fields, Deepfakes can be used to manipulate public opinion, discredit opponents, or create confusion. They can influence elections and international relations, and fuel conspiracy theories.

Real examples of Deepfake use: Fake content featuring Isărescu and people who lost money

DNSC recalls that the National Bank of Romania (BNR) recently warned the public about a fraud scheme involving the use of Deepfake technology to create fake videos of the governor of the BNR.

In these videos, the governor appears to be promoting an investment platform, but the BNR has declared them fake. The scam uses AI to alter the governor’s voice and image in order to mislead the public into participating in fraudulent investments that promise quick and easy financial gains.

Another case is that of a retired woman from Vaslui, who fell victim to a scam promoted on YouTube. Scammers created a fake video in which a well-known banker and other public figures recommended an investment platform.

The promise of quick profits convinced the pensioner to invest 52,000 lei, savings accumulated over 20 years of work. Although the woman reported the fraud to the authorities, experts believe the chances of recovering the lost funds are minimal.

“Malicious actors are becoming more inventive. They can now perfectly imitate the voice of a loved one to fool you. They can call you pretending to be a family member who urgently needs money. We recommend always verifying the caller’s identity, even if it appears to be a relative or friend you know. Never send money in a hurry, and notify the authorities immediately if you have any suspicions. Only through vigilance and information can you protect yourself from this form of fraud,” DNSC warns.

In 2019, an audio Deepfake was used to defraud a CEO of €220,000.

  • “The managing director of a UK-based energy firm thought he was on the phone with the managing director of its German parent company when he followed orders to immediately transfer €220,000 to a Hungarian supplier’s bank account. The voice actually belonged to a fraudster using AI voice technology to impersonate the German CEO,” DNSC says.

The danger of Deepfake in the electoral context

Deepfake technologies, capable of realistically synthesizing images and voices, can significantly influence the opinion and vote of voters during election campaigns.

  • “A politician’s reputation can be seriously damaged by fabricated Deepfake videos, which can discredit their image and hurt their chances of winning an election. Spreading misinformation through Deepfake can manipulate public perception of a politician’s character and agenda, negatively affecting their political career.
  • Deepfake can be used as a strategic tool to discredit rival parties by creating false material that portrays them in a negative light,” the DNSC guide states.

How to spot a Deepfake: Speech/sound and lip movement out of sync when pronouncing the letters b, m and p

DNSC experts say there are certain clues that can betray a Deepfake. Here’s what to look out for:

  • Surroundings (e.g. missing shadows, too strong reflections, blurry areas)
  • Facial imperfections (unrealistic moles, unsynchronized blinking, distortions inside the mouth such as missing teeth and tongue, teeth that are too perfect, etc.)
  • Speech/sound and lip movement out of sync (e.g. during a sneeze).

“Speech/sound and lip movement out of sync can be noticed when pronouncing the letters b, m and p. Sometimes grayscale pixels appear at the edges of the modified components. A forgery can also be spotted when the person in the recording is viewed from a different angle: if photos of the person from different angles were not used to create the Deepfake content, the algorithm cannot infer the person’s appearance from another angle, resulting in distortions,” says DNSC.
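For a closer look, it can also help to step through a suspicious clip frame by frame rather than watching it at normal speed. The minimal sketch below is not part of the DNSC guide; it assumes OpenCV is installed, and the input file name, output folder and frame step are illustrative. It simply saves every Nth frame of a video so that blinking, teeth, the inside of the mouth and lip movement can be checked against the clues listed above.

```python
# Minimal sketch (not from the DNSC guide): extract every Nth frame of a
# suspicious clip with OpenCV so the face, blinking, teeth and lip movement
# can be inspected frame by frame for the artifacts described above.
# The input file name, output folder and step size are illustrative assumptions.
import cv2  # pip install opencv-python
from pathlib import Path

def extract_frames(video_path: str, out_dir: str = "frames", step: int = 10) -> int:
    """Save every `step`-th frame as a PNG and return how many frames were written."""
    Path(out_dir).mkdir(exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved = index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of the video or an unreadable file
            break
        if index % step == 0:
            cv2.imwrite(f"{out_dir}/frame_{index:06d}.png", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

# Example (hypothetical file name):
# print(extract_frames("suspicious_clip.mp4"))
```

Inspecting still frames makes many of the artifacts mentioned by DNSC (missing shadows, distorted teeth or tongue, grayscale pixels at the edges of modified areas) much easier to notice than in a moving video.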

Tips to avoid being fooled by Deepfakes

Avoiding Deepfake scams requires a combination of healthy skepticism, attention to detail, and the use of verification tools. Here are some helpful tips:

  • Don’t believe everything you see online! The Internet is a vast source of information, but not all of it is true. It is important to develop a healthy skepticism and carefully analyze any video or photo content before accepting it as real.
  • Look for signs of manipulation: Deepfakes can be very sophisticated, but they can often be identified by certain clues. Watch for lighting discrepancies, alignment errors, skin irregularities, or lip-syncing issues with the sound.
  • Check the source: Where is the video or image from? Is it distributed on a trusted platform? Seek confirmation of information from credible sources or directly from the entities or individuals involved.
  • Use verification tools: There are many organizations and online tools that can help you verify that a piece of information is real. Use them to investigate the authenticity of suspicious content.
  • Don’t rely on just one source: Seek confirmation from multiple credible sources. A single video or image is not enough to verify a piece of information.
  • Learn about Deepfakes: The better you understand how this technology works, the better you’ll be able to spot fakes. There are many online resources that explain the principles of Deepfakes and detection methods.

By applying these tips, you can reduce the risk of being scammed by Deepfake content and help promote a culture of verification and accountability online.

What to do if you are the victim of a Deepfake

If you find yourself in the unfortunate position of being the victim of a Deepfake, it’s important to act quickly and effectively to minimize the damage. Here are some steps you can follow:

  • Document the abuse: Save copies of the Deepfake content, including URLs, screenshots, or any other form of evidence that might be relevant; this is essential for any legal action or subsequent report (one way to keep track of such evidence is sketched after this list).
  • Report content: Most social media platforms and websites have strict policies against Deepfakes and manipulated content. Users can easily report suspicious content using the platform’s built-in reporting feature.
  • Contact the authorities: In serious cases, where Deepfake content violates laws regarding defamation, harassment, or distribution of pornography without consent, you may need to contact local authorities or other law enforcement agencies.
  • Request legal help: Consult an attorney to evaluate the legal options available to you. This may include legal action against those who created or shared the Deepfake content.
  • Use online reputation management services: There are companies that specialize in improving your online presence and removing or lessening the impact of negative content. These services can be useful to protect your image in the long run.
  • Communicate with transparency: If the Deepfake has the potential to affect your career or personal relationships, consider talking openly about the situation with your employer, colleagues or loved ones. Providing context and your side of the story can help lessen the negative impact.
  • Protect your personal information: In the wake of a Deepfake incident, it’s important to be extra cautious about online security. Check your social media privacy settings, change passwords and monitor your account activity for signs of unauthorized access.
  • Emotional support: The psychological trauma suffered by the victim of a Deepfake can be significant. Don’t hesitate to seek support from friends, family or mental health professionals.
  • Education and Awareness: Help raise awareness about the dangers of Deepfakes by sharing your experience if you feel comfortable. This can help inform and protect others.
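As a practical aid to the documentation step, the short sketch below shows one possible way to keep an evidence log. It is not from the DNSC guide, and the file names, the source URL and the evidence_log.json path are illustrative assumptions; it records, for each saved screenshot or video, a timestamp and a SHA-256 fingerprint so the file’s integrity can be demonstrated later.

```python
# Minimal sketch (not from the DNSC guide): keep a simple evidence log for a
# suspected Deepfake, storing a timestamp and a SHA-256 fingerprint of each
# saved file so its integrity can be demonstrated later.
# File names, the URL and "evidence_log.json" are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def log_evidence(files: list[str], source_url: str, log_path: str = "evidence_log.json") -> None:
    """Append hashes and timestamps for saved screenshots/videos to a JSON log."""
    log = Path(log_path)
    entries = json.loads(log.read_text()) if log.exists() else []
    for name in files:
        p = Path(name)
        entries.append({
            "file": p.name,
            "sha256": sha256_of(p),
            "source_url": source_url,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        })
    log.write_text(json.dumps(entries, indent=2))

# Example (hypothetical file name and URL):
# log_evidence(["deepfake_screenshot.png"], "https://example.com/suspicious-video")
```

Such a log does not replace a formal report, but it gives the authorities and the platform a clear record of what was found and when.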

For details, read also: Isărescu reported "Isărescu" to Zuckerberg for deepfake / Meta’s answer: Community standards are respected
