Deepfakes are getting harder to spot


Deepfakes are AI-generated media in which a computer system replaces a person’s face, voice, or actions in a video, photo, or audio clip so that it looks real. The technology uses machine learning models that study thousands of images or recordings of someone and then recreate them convincingly. The result can look and sound authentic even though the event never happened.

The problem begins when people accept what they see or hear without checking the source.

Why Deepfakes Are Dangerous

1. False Evidence

A deepfake can show a person saying or doing something they never did.
This can be used to fabricate scandals, crimes, or statements.

Examples:

  • A fake video of a politician declaring war
  • A CEO “announcing” company bankruptcy
  • A teacher or public figure appearing to commit misconduct

Even if the video is later proven fake, the damage is already done because people remember the first impression.

2. Financial Scams

Deepfake voice cloning is already used in fraud.

A criminal can clone a company executive’s voice and call an employee saying:

“Transfer the money now. This is urgent.”

Employees often comply because the voice sounds exactly like their boss.

Several companies have already lost millions this way; in one widely reported 2019 case, criminals reportedly used an AI-cloned executive voice to trick a UK energy firm into wiring roughly €220,000.

3. Identity Theft and Reputation Damage

Someone can create fake videos or images of a person in illegal or embarrassing situations.

Victims often struggle to prove the content is fake because the video looks authentic.

This can destroy careers or relationships before the truth comes out.

4. Political Manipulation

Deepfakes can influence elections or public opinion.

For example:

  • Fake speeches
  • Fake confessions
  • Fake war footage

If released right before an election or major event, there may not be enough time to verify it before people react.

5. Loss of Trust in Reality

This is the long-term danger.

Once deepfakes become common, people begin to doubt everything.

Two things happen at the same time:

  • Real events get dismissed as fake
  • Fake events get believed as real

This creates what researchers call the “liar’s dividend.”
A real video can be dismissed by simply saying, “That’s a deepfake.”

Why Research and Verification Matter

Because of deepfakes, information must be verified before accepting or sharing it.

Basic checks include:

Check the source

  • Who posted it first?
  • Is it from a reliable outlet?

Look for original footage

  • If only one random account has the video, that is suspicious.

Search for confirmation

  • Major events are reported by multiple credible organizations.

Watch for technical clues

Deepfakes sometimes show:

  • unnatural blinking
  • lighting mismatches
  • lip sync errors
  • distorted hands or edges

AI is improving quickly, so technical clues alone are not enough.
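One mechanical check that complements the habits above is comparing a file you received against a cryptographic hash published by the original source: if even one byte was altered (by re-encoding, splicing, or face-swapping), the hashes will not match. Here is a minimal sketch in Python, assuming the publisher posted a SHA-256 hash alongside the original clip (the filenames and hash in any real use would come from that publisher, not from this example):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large video files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_hash(path: str, published_hash: str) -> bool:
    """True only if the local file is byte-identical to the file
    whose hash the original source published."""
    return sha256_of_file(path) == published_hash.strip().lower()
```

Note the limit of this check: it only proves a file is identical to (or differs from) one specific published original. It says nothing about whether that original is itself genuine, which is why provenance efforts such as C2PA Content Credentials aim to attach signed metadata at the moment of capture instead.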

The Bigger Picture

Humans evolved to trust what they see and hear. For over a century, photographic and video evidence was considered among the strongest proof of reality. Deepfakes break that assumption.

Society is now entering a period where seeing is no longer believing. The defense against this is not just better technology but better thinking habits: verifying sources, checking context, and resisting the urge to react instantly.

The strange twist is that the same AI tools that create deepfakes are also being used to detect them. It has become an arms race between deception and verification. The outcome will depend less on technology and more on whether people slow down and investigate information before accepting it as truth.
