
An AI deepfake impersonating renowned actress Jennifer Aniston reportedly sweet-talked a British man into parting with his money.

AI Deepfake Deception in Southampton, UK: Resident Paul Davis fell victim to a scam built around a deepfake of actress Jennifer Aniston, losing $255.

AI Jennifer Aniston allegedly manipulates a British man into parting with his money.

A new form of financial fraud is on the rise in the digital age: AI deepfake scams. These schemes use highly convincing fake audio and video to impersonate celebrities or corporate executives, manipulating victims into transferring money or divulging sensitive information.

One such victim was Paul Davis, a 43-year-old resident of Southampton, UK. Davis found himself communicating with an AI deepfake impersonating American actress Jennifer Aniston. The deepfake relied on manipulative "love bombing" tactics: using affectionate pet names, professing love, and even sending an image of Aniston holding a digitally altered sign that read "I Love You." The AI-generated messages were not just expressions of affection, however; they came with doctored certificates and fake identification cards, and they repeatedly solicited money.

The scam cost Davis approximately $255 (£200), and he is far from alone. Scammers are exploiting advances in AI technology, with romance-based scams emerging as a prevalent tactic that often targets elderly individuals or those with limited technological proficiency.

This trend is not limited to Davis's experience. In January, a French woman was deceived by an AI deepfake of actor Brad Pitt, leading her to believe they were in a romantic relationship and ultimately defrauding her of her life savings. The woman was convinced to transfer approximately $850,000 to an account in Turkey, believing the funds were needed for Pitt's alleged cancer treatment.

The use of celebrity deepfakes in scams is becoming increasingly common. Deepfakes of celebrities—such as Elon Musk—are notably used in sophisticated crypto scams to lend false legitimacy to fraudulent investment opportunities or giveaways. These AI-generated videos or audio clips simulate endorsements or instructions from celebrities to trick victims into sending cryptocurrency or making payments.

The growing prevalence of such scams is alarming. According to Pindrop's 2025 Voice Intelligence & Security Report, deepfake-related fraud is expected to grow by 162% in 2025, making up an increasingly large share of total fraud attempts. Deepfaked calls alone are projected to surge by 155%, indicating a sharp rise in scams involving voice impersonation, including those mimicking well-known figures.

Moreover, deepfake generation has become commoditized, widely accessible, and easy to produce at scale. This democratization means scammers can mass-produce convincing fake identities and content, leading to organized and scaled fraud operations targeting not just individuals but also businesses.

The financial impact of these scams is significant. The FBI reported over $16 billion in losses from imposter scams in 2024, many of which increasingly involve AI-driven techniques including deepfakes. The sophistication and believability of celebrity deepfakes amplify their potential to defraud victims, contributing to the surge in financial scams and losses.

In light of these developments, it is crucial for individuals and businesses to stay vigilant against AI deepfake scams. Recognising the signs of such scams, such as unusual requests for money or personal information, and verifying the authenticity of communications before taking any action, can help protect against falling victim to these sophisticated schemes.

  1. AI deepfakes are increasingly reported not only in financial fraud but also in romance-based scams, as in the case of the French woman defrauded by an AI deepfake of Brad Pitt.
  2. As cryptocurrency's popularity grows, so does the fraudulent use of celebrity deepfakes, with figures like Elon Musk deployed to lend false legitimacy to fake investment opportunities and giveaways.
