
Jennifer Aniston Deepfake Romance Scam: Victim Fooled by AI Impersonation

Alina BÎZGĂ

July 08, 2025


Besides being a household name, Jennifer Aniston is also a household image for scammers. For more than a decade, cybercriminals have exploited her identity in scams ranging from spam and phishing emails to fake endorsements. Now they’re abusing her image in AI-generated romance fraud.

According to a Bitdefender study from 2013, Aniston ranked among the most misused celebrity names in spam campaigns. More recently, she’s appeared in fake celebrity endorsement scams, where her face and fabricated quotes are used to promote fake products.

But in the latest scam scenario, fraudsters took the deception to a whole new level, using deepfake videos of Aniston to convince Paul, a 43-year-old man from Southampton, that America’s beloved girl next door was in love with him.

It began with friendly videos calling him “babe” and escalated into emotional manipulation. Paul ultimately sent £200 in Apple gift cards, supposedly to help cover her “Apple subscription.”

The scam was executed with disturbing ease, with the scammer using just a few public photos, some voice samples, and free AI tools.

One of the most manipulative elements of this scam was how the fraudster created a false sense of exclusivity and urgency. According to messages shared with The Sun, the scammer warned the victim not to trust or interact with “fake accounts,” asking him to delete previous chats and avoid commenting on public pages.

The message was accompanied by a fake California driver’s license edited to feature Jennifer Aniston’s name, birthday, and photo. By sharing it, the scammer sought to convince the victim that he was speaking with the real actress.

“You texting with the real Jennifer Aniston ok? Don’t go to Facebook to reply anyone anymore... so you don’t get into problems,” the scammer wrote.

And believe it or not, this is the new face of romance scams, where not even the person you see on the screen is real.

How AI and Celebrity Deepfakes Fuel the Next Generation of Romance Scams

Deepfakes are videos or audio clips created using AI to mimic the voice, face, and mannerisms of real people. With nothing more than publicly available photos and free AI tools, scammers can fabricate incredibly lifelike videos that look like they come directly from a celebrity.

In Paul’s case, he received videos of “Jennifer Aniston” requesting small favors, such as help with an “Apple subscription.” The scammers persuaded him to send gift cards, a tactic that remains popular due to its anonymity and irreversibility.

Why Do Fake Celebrities Make Perfect Scam Bait?

As discussed in Bitdefender’s deep dive into celebrity romance scams, these scams exploit the one-sided emotional bonds people form with public figures over time. When someone you’ve admired from afar suddenly seems to reciprocate, it creates a powerful illusion that’s hard to resist.

There’s precedent, too. Victims of romance scammers have been targeted by deepfake versions of Brad Pitt, Keanu Reeves, Owen Wilson, Martin Henderson, and many others. One woman was defrauded of nearly £700,000 by someone impersonating Pitt online.

Today’s romance scams combine classic social engineering with cutting-edge tech. The process often looks like this:

  • Initial contact via social media or messaging apps.
  • AI-generated videos or voice notes that build rapport and trust.
  • Emotional manipulation, often involving urgency, secrecy, or fabricated emergencies.
  • Non-traditional payment methods, like Apple gift cards or cryptocurrency, which are difficult to trace.

Even romance-themed apps are being flooded with bots and AI personas mimicking famous faces, amplifying the scale of deception.


What to Do If You’re Targeted

If you suspect you’re interacting with a scammer using AI deepfakes:

  1. Do not send money or personal information.
  2. Take screenshots and save any suspicious videos or messages.
  3. Report the incident to Action Fraud (UK), the FTC (US), or your country’s cybercrime division.
  4. Use tools like Bitdefender Scamio to analyze suspicious messages or links in real time.

Survivor Insight: Ayleen Charlotte’s Story

To understand the emotional toll of romance scams, Bitdefender spoke with Ayleen Charlotte, a victim of the infamous Tinder Swindler. In her two-part interview, she reflects on the psychological damage, the difficult healing process, and how she reclaimed her voice to help others.

Her story highlights what many victims suffer behind the scenes (grief, shame, and confusion), and why support, awareness, and education are essential for healing.

Remember: AI isn’t just transforming how we work and create; it’s also redefining how scammers deceive. What used to be a badly written love letter is now a polished, smiling deepfake of your favorite celebrity.
