In our digital age, where misinformation travels faster than truth, the proliferation of deepfake videos and fabricated endorsements has become a dangerous and pervasive threat. One of the most recent and disturbing trends to hit the United Kingdom is the widespread circulation of fake video clips on social media platforms. These videos falsely depict highly revered public figures, including Prime Minister Sir Keir Starmer, Prince William, and even His Majesty King Charles III, endorsing dubious cryptocurrency and stock investment schemes, most notably one known as Quantum AI.
The strategy behind these scams is as insidious as it is clever: leveraging the trust and credibility of national figures to bait unsuspecting citizens into investing in fraudulent platforms. With these personalities positioned as purported advocates of a “revolutionary” trading technology, it becomes nearly impossible for the average viewer to distinguish fiction from reality. Yet the most disturbing aspect of this deceit is not just the creation and circulation of the deepfakes; it is the ominous silence of the very individuals whose identities are being hijacked.
Quantum AI, a platform purportedly powered by artificial intelligence to deliver extraordinary returns on crypto investments, is a recurring name in these fake advertisements. The scams often feature professionally edited videos, convincingly manipulated to show public figures discussing how they have personally benefited from using the platform. Sometimes these clips are cut from genuine interviews, with altered voiceovers or synthetic AI-generated speech that mimics the real voices of these leaders.
Once a clip gains traction on platforms like Facebook, Instagram, X (formerly Twitter), TikTok, and YouTube, it directs viewers to sign up on a linked website. Users are then prompted to deposit a “minimum investment” of a few hundred pounds. Shortly after, a representative reaches out, encouraging further investments, often with pressure tactics or fake success stories. It is only after their funds vanish without a trace that victims realize they have been defrauded.

There is no shortage of testimonies from Britons who have fallen victim to these scams. Consider Margaret, a 64-year-old retired teacher from Leeds, who stumbled upon a video of King Charles III speaking about a “new royal-endorsed financial future for the people of Britain.” Trusting the monarch, she invested £2,000 into what she believed was a safe and patriotic financial decision. Today, not only is the money gone, but she also bears the emotional scars of betrayal and embarrassment.
Pictured: Prince William and Princess Kate
Or think of John, a 35-year-old warehouse worker from Manchester, who saw a Facebook video in which Prime Minister Keir Starmer appeared to claim that Quantum AI was “transforming Britain’s economy.” John invested £700, hoping to grow his savings. A week later, his account was inaccessible, and customer service had gone dark. Worse still, he was contacted again by the scammers, this time posing as financial regulators, demanding more money to “recover” his funds.
These stories are not isolated incidents. Yet, despite the scale of the deceit and the prominent misuse of their identities, public figures like Starmer, Prince William, and King Charles III have made no public statements explicitly denying their association with these scams.
Why are they not saying anything?
Their silence is both puzzling and dangerous for British society. In the age of viral content and instant information dissemination, a clear, public denial from these figures could be instrumental in protecting the nation’s most vulnerable. Their failure to speak up may not only be interpreted as tacit approval; it may also lend further credibility to the fraudulent schemes. The monarchy, the Prime Minister’s office, and other public institutions have media arms, communication advisers, and press secretaries capable of issuing immediate and forceful denouncements. Yet their responses have been tepid at best, confined mostly to ambiguous disclaimers buried in official websites or scattered among the footnotes of press releases.
Public figures wield enormous influence, no doubt, and that influence comes with responsibility. In times of crisis or public confusion, silence from leaders becomes complicity. When citizens are being defrauded in their thousands through the likeness and credibility of those they trust, it is not enough to rely on the law to catch the criminals. There must be a proactive effort to disassociate and inform. A formal press conference, a televised statement, or even a widely publicized social media post from the Prime Minister or the Royal Family could go a long way in alerting the public. When silence reigns, the scammers thrive.
Pictured: Prime Minister Sir Keir Starmer
From a legal perspective, deepfake technology treads murky waters. While laws on defamation and impersonation exist, enforcement is complex, especially when scammers operate internationally. The platforms hosting these videos often remove them only after they have gone viral, by which time the damage has already been done.

Ethically, the issue is graver. When citizens invest not merely their money but their trust in public figures, those figures owe it to the people to protect that trust. The damage goes beyond finances: it erodes national confidence in leadership, in the monarchy, and in the media itself. Moreover, the failure to respond adequately to this threat might be read as a systemic failure to adapt to the modern information war. The government has invested in digital literacy and online safety campaigns, yet the most powerful deterrent in this situation remains unused: the voices of the figures purportedly involved.
The complicity of social media companies cannot be ignored or swept under the carpet either. These platforms profit from engagement, even when that engagement is rooted in abject deceit. While Facebook, YouTube, and others claim to have policies against misinformation, enforcement is often reactive, and their algorithms prioritize virality over veracity. There have been instances, for example, where deepfake videos remained online for days even after being reported. In that time, thousands could have been scammed. That raises the question: are social media platforms doing enough? Should they not be held accountable for hosting and amplifying content that leads to financial ruin?
It is worth noting that regulators in some countries are considering or have already implemented legislation making platforms partially liable for the content they host. The UK’s Online Safety Act is a step in the right direction, but its implementation must be vigorous and its reach extensive. While institutional and legal reforms are crucial, public awareness remains the first line of defence. Education campaigns must be intensified to help people spot deepfakes, question too-good-to-be-true financial claims, and verify sources before engaging with any investment.
Painful: Ms Jensen will spend the next 27 years paying back a £23,000 bank loan she invested in Quantum AI
Financial watchdogs like the Financial Conduct Authority (FCA) and the national fraud-reporting service Action Fraud have already issued warnings, but these need to be more aggressive and more visible. Collaborations with popular media, influencers, and educational institutions could make the message more accessible. More importantly, the language of such campaigns must be relatable. The elderly, who are often the most susceptible to these scams, require straightforward, jargon-free communication. Younger generations must be taught digital scepticism alongside traditional literacy.
What, then, can be done to mitigate this growing public danger?
Public figures must come forward and state clearly that they have no connection to these scams, whether through press releases, media interviews, or social media posts. Government and regulatory agencies must pressure social media platforms to implement stronger AI detection and takedown mechanisms for deepfake content. The UK should strengthen its laws to criminalize the creation and distribution of deepfake content that facilitates fraud. Comprehensive public awareness campaigns, tailored to all age groups and demographics, should be launched to build resistance against online deception. Finally, dedicated helplines and financial advisory services for scam victims can provide the emotional and legal support they need.
The silent suffering of Britons who are falling prey to these scams should be a national emergency, not a footnote in public discourse. The reputations of King Charles III, Prince William, and Prime Minister Sir Keir Starmer are being weaponized against their own people. In such circumstances, silence is not a virtue; it is a betrayal. To remain mute while criminals exploit the nation’s trust in its most revered figures is to allow the deception to persist. The time has come for those in positions of influence and authority to speak up, act decisively, and help shield the public from these digital wolves in sheep’s clothing. In the battle against online deception, silence is not golden: it is fatal.