Deepfakes: What to Do When You Can’t Trust What You See and Hear

Find out what deepfake technology is, how it’s being used to defraud and deceive consumers, and how you can help protect yourself from becoming a victim.

Deepfake technology uses artificial intelligence (AI) to alter audio, video, and imagery so that the manipulated content appears authentic. Criminals use deepfakes in many ways to stoke fear or build trust and manipulate their victims.

How deepfakes can deceive and defraud consumers

  • Financial Fraud: Deepfakes can be used in blackmail schemes or to impersonate people in video calls or audio recordings to request money.
  • Reputation Damage: Deepfakes can be used to create false video or audio of individuals making damaging statements or behaving inappropriately.
  • Phishing: Cybercriminals can use deepfake technology to trick people into disclosing sensitive information, thinking they are interacting with a legitimate business or organization.
  • Authentication Disruption: Deepfakes can be used to deceive identity verification technologies, such as facial recognition and voice recognition, to access sensitive information and financial accounts.
  • Manipulating Trust: Deepfakes can be used to generate phony customer reviews or testimonials, making fraudulent goods seem credible and leading consumers to purchase fake or substandard products.

How to avoid falling victim to malicious deepfakes

Watch what you share online

Be cautious with the personal information, photos, and videos you share online. This limits the amount of source material that could be used to create deepfake impersonations of you or your loved ones. Stay on top of your social media privacy settings, and only accept friend requests from people you know.

Verify requests for money or sensitive information

If you receive a suspicious or urgent request from someone you know, or if something seems out of character for a trusted source, verify the legitimacy of the request directly with that person or organization by contacting them through a known phone number or channel before you respond.

Learn to spot the fakes
  • Watch for unnatural facial expressions or body movements, such as irregular blinking, odd head positioning, or lip movement that doesn’t match the speech.
  • Look for flaws in the background, such as blurry objects and unnatural lighting or shadows.
  • Pay attention to imperfections in audio quality, speech patterns, tone, and the alignment of pace with facial movements.
  • Look for watermarks. Some social media platforms are employing deepfake detection to flag potentially altered content.

Create secure passwords

Develop passwords that are unique for each account, have at least 15 characters, and use a mix of upper- and lowercase letters, numbers, and special characters.
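For readers comfortable with a little code, the guidance above can be sketched as a short Python script using the standard `secrets` module. This is a hypothetical illustration of the password rules described here, not a KeyBank tool:

```python
import secrets
import string

# Assumed symbol set for illustration; any broad set of special characters works.
SYMBOLS = "!@#$%^&*"

def generate_password(length=15):
    """Generate a random password of at least 15 characters that mixes
    upper- and lowercase letters, numbers, and special characters."""
    if length < 15:
        raise ValueError("Use at least 15 characters.")
    alphabet = string.ascii_letters + string.digits + SYMBOLS
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Keep drawing until all four character classes are present.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in SYMBOLS for c in pw)):
            return pw

print(generate_password())
```

In practice, a reputable password manager can generate and store passwords like these for you, so each account gets a unique one without you having to remember it.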

Use additional verification procedures

Use multi-factor authentication whenever possible, which employs extra security measures such as one-time passcodes and facial recognition to verify your identity.

Knowledge is power

When it comes to avoiding malicious deepfake fraud schemes, education is your first line of defense. KeyBank is here to help provide the latest information and resources. Learn more about our commitment to fraud prevention at key.com/fraud.


What to do if you think you’ve been defrauded by deepfake media

If you think your financial accounts may be at risk, contact KeyBank as soon as possible. We can check to see if your accounts have been compromised and take measures to help prevent any further fraudulent activity.

Call the KeyBank Fraud Client Service Center at 1-800-433-0124, or dial 711 for TTY/TRS.

Content is provided for informational and educational purposes only and is in no way to be construed as financial, investment, or legal advice. We cannot and do not guarantee its applicability or accuracy with regard to your individual circumstances. All examples are hypothetical and for illustrative purposes only. We encourage you to seek personalized advice from qualified professionals regarding all personal financial issues.