October 2024 - Dealing With Deepfakes in the Corporate Setting

Deepfake technology has become a vexing issue in the privacy and cybersecurity space. Bad actors are leveraging deepfakes to run cyber scams and harass people from all walks of life.

When you hear about deepfake technology, your mind probably goes to personal attacks like these. Perhaps a bad actor has placed the facial likeness of a famous person onto explicit content, hoping either to blackmail the victim for a sum of money or simply to harm their reputation.

It would be naive, however, to believe this is the only way deepfake technology can be harmful.

In fact, deepfake technology can be leveraged against those who work in business environments, and the result can be disastrous for any organization.

This blog post will explain how deepfakes can be weaponized in a corporate environment, and what security professionals can do to protect their organizations from losing valuable assets when this technology is turned against them.

How Deepfake Technology Works

Bad actors use deepfake technology to impersonate a specific individual, with the goal of convincing a target that they are talking to the genuine person.

This can be done by impersonating either the person's voice or their likeness. To do so, a bad actor needs to collect a large amount of data on the person they wish to impersonate: images, videos, and audio recordings. Once they have that dataset, the bad actor uses machine learning to train a model that mimics the person's vocal patterns and facial expressions.

Bad actors want to collect as much information as possible on the person they are impersonating; the more data they have at their disposal, the more realistic the imitation will be. This is especially troubling when the person being impersonated is a public figure. For example, if a deepfake model is trained on hundreds of audio recordings of a particular individual, it can learn that person's speaking patterns and cadence.

Bad actors can then have the model say whatever they want, and the model will say it in the voice of the person it was trained on. The same techniques can be applied to facial expressions, which bad actors use to create fake videos.

When Deepfake Tech is Weaponized in a Corporate Setting

Deepfake technology can be incredibly deceptive, and in a corporate setting, it can also be used to steal valuable assets from an organization.

It starts at the executive level, but not in the way you may think. Suppose a bad actor wants to build a deepfake model that impersonates a high-ranking executive, and that this executive has made various public speaking appearances. The bad actor collects those recordings and builds a model of the executive's voice.

The bad actor then takes this voice model and gets in touch with an employee at the company. Speaking through the cloned voice of the executive, the bad actor instructs the employee to make a sensitive money transfer.

If the model is accurate enough, the employee may not think twice about fulfilling the request. The employee is told where to send the money, not knowing they are sending it to a bad actor. This is not hypothetical: in one widely reported 2024 incident, an employee at a multinational firm was tricked by a deepfaked video conference into transferring roughly US$25 million to fraudsters. The same approach can be used to request login credentials, valuable corporate information, or any other internal data a bad actor wants to get their hands on.

Of course, as mentioned earlier, a bad actor may also create a deepfake that makes offensive statements appear to come from that executive. This could end up harming the entire organization's reputation, which brings its own share of headaches.

Deepfake technology can be strikingly accurate when it is done well. Security professionals should take the proper steps to ensure it is not used to harm their organization in any manner.

Validation is the Name of the Game

Deepfake technology can wreak havoc in your organization if you aren't prepared. Security professionals would benefit from taking the time to develop strategies to combat deepfake attacks that could be aimed at their organizations.

The best place to start is with education. Security professionals should teach employees how deepfake models can be used to steal assets from the organization, ideally with concrete examples. While deepfake audio can be convincing, it isn't perfect. If the voice on the other end of the line sounds off in any way, employees should suspect a bad actor.

Audio deepfakes may sound flat and lifeless, and there may be unusually long pauses between words and sentences. This bit of advice could be the difference between keeping an organization safe and losing valuable information and money.
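
To make that advice concrete, here is a minimal sketch of what an automated first-pass check for those two artifacts might look like, assuming Python with the librosa audio library. The thresholds are illustrative assumptions, not calibrated values, and heuristics like these will produce false positives; they complement human judgment and dedicated detection tools rather than replace them.

```python
# Minimal sketch: flag audio with unusually long pauses or a flat pitch
# contour, two artifacts sometimes present in synthetic speech.
# MAX_PAUSE_SEC and MIN_PITCH_STD_HZ are assumed, uncalibrated thresholds.
import librosa
import numpy as np

MAX_PAUSE_SEC = 1.5      # assumed: pauses longer than this are suspicious
MIN_PITCH_STD_HZ = 10.0  # assumed: low pitch variation suggests "flat" delivery

def flag_suspicious_audio(path: str) -> list[str]:
    y, sr = librosa.load(path, sr=None)
    warnings = []

    # Find voiced intervals; the gaps between them are pauses.
    intervals = librosa.effects.split(y, top_db=30)
    for prev_end, next_start in zip(intervals[:-1, 1], intervals[1:, 0]):
        pause = (next_start - prev_end) / sr
        if pause > MAX_PAUSE_SEC:
            warnings.append(f"long pause of {pause:.2f}s detected")

    # Estimate the fundamental frequency; very low variance across the
    # recording can correspond to the lifeless delivery described above.
    f0, _, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                            fmax=librosa.note_to_hz("C7"), sr=sr)
    if np.nanstd(f0) < MIN_PITCH_STD_HZ:
        warnings.append("unusually flat pitch contour")

    return warnings
```

A recording flagged this way should simply trigger the verification steps described next, not an automatic accusation.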

It's also highly recommended that every organization build verification measures into any request of consequence, regardless of who appears to be making it. Employees should obtain in-person (or otherwise out-of-band) confirmation before sending over anything of value.
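
As a sketch of how such a policy might be encoded, consider the following. The class, function names, threshold, and approval rules here are hypothetical examples under assumed requirements, not a prescribed standard:

```python
# Hypothetical sketch of an out-of-band verification gate for transfers.
from dataclasses import dataclass

# Assumed threshold; tune to your organization's risk appetite.
APPROVAL_THRESHOLD_USD = 10_000

@dataclass
class TransferRequest:
    requester: str      # who the caller claims to be
    amount_usd: float
    destination: str

def approve_transfer(req: TransferRequest,
                     confirmed_out_of_band: bool,
                     second_approver_ok: bool) -> bool:
    """Return True only if checks the attacker cannot fake have passed."""
    # Confirmation must come from a channel the attacker does not control:
    # a face-to-face check, or a callback to a number already on file
    # (never a number supplied during the suspicious call itself).
    if not confirmed_out_of_band:
        return False
    # Larger transfers also require sign-off from a second approver
    # (a hypothetical rule, included to show layered checks).
    if req.amount_usd >= APPROVAL_THRESHOLD_USD:
        return second_approver_ok
    return True

# Example: a deepfaked "executive" call alone never clears this gate.
request = TransferRequest("CFO", 250_000, "acct-4821")
assert approve_transfer(request, confirmed_out_of_band=False,
                        second_approver_ok=True) is False
```

The design point is that approval never hinges on the voice or face making the request; it hinges on a channel the attacker does not control.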

It doesn't have to end there. Security professionals can look into various tools designed to identify deepfake audio and video. It may also be helpful to advise executives to lock down their social media accounts, to keep bad actors from harvesting the audio and video recordings used to build even more convincing deepfake models.
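
If an organization adopts such a tool, it can be wired into the intake path for inbound media. The sketch below uses an invented DetectionClient class and score_audio method as stand-ins for whichever vendor SDK is ultimately chosen; none of these names refer to a real product.

```python
# Hypothetical integration sketch: screening inbound voicemail attachments
# with a third-party deepfake-detection service before delivery.
SUSPICION_THRESHOLD = 0.8  # assumed score cutoff, vendor-dependent

class DetectionClient:
    """Stand-in for a vendor SDK; returns a synthetic-speech likelihood 0..1."""
    def score_audio(self, path: str) -> float:
        raise NotImplementedError("replace with a real vendor call")

def screen_voicemail(client: DetectionClient, path: str) -> str:
    score = client.score_audio(path)
    if score >= SUSPICION_THRESHOLD:
        return "quarantine"  # route to the security team for manual review
    return "deliver"
```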

Deepfake technology is the latest deception tactic deployed by bad actors in their mission to get their hands on valuable assets. Even though it may appear to be a daunting proposition, security professionals can keep their organizations safe by staying on top of this development, and of any other cyber scams that come down the line.

