Narayana Murthy Alerts Public to Deepfake Scam, Denies Endorsing Trading Apps Amidst False Claims

In a recent warning on X (formerly Twitter), Infosys founder Narayana Murthy cautioned the public about deepfake images and videos circulating on social media that falsely depict him endorsing trading applications. Murthy urged users not to be misled by such misinformation and emphasized the importance of reporting such instances to the relevant authorities.

In a series of tweets, Narayana Murthy addressed the issue, shedding light on the deceptive nature of some web pages that falsely asserted his investment in automated trading applications. He refuted any involvement in such activities and expressed concern about the spread of fake news through social media and deceptive websites.

Murthy stated, “In recent months, there have been several fake news items propagated via social media apps and on various webpages available on the internet, claiming that I have endorsed or invested in automated trading applications named BTC AI Evex, British Bitcoin Profit, Bit Lyte Sync, Immediate Momentum, Capitalix Ventures, etc.”

He highlighted that these misleading news items often appear on fraudulent websites posing as popular news platforms. Some of these sites even go so far as to publish fake interviews built on deepfake pictures and videos. Deepfake technology allows for the creation of highly convincing, but entirely fabricated, multimedia content by superimposing one person's likeness onto another.
"I categorically deny any endorsement, relation, or association with these applications or websites. I caution the public not to fall prey to the content of these malicious sites and to the products or services they are trying to sell," Murthy emphasized.

The tech visionary urged people to remain vigilant and report any instances of such misleading information to the appropriate regulatory authorities. His plea aligns with a growing concern over the rise of deepfake content, which has the potential to deceive the public by creating realistic but entirely fictional portrayals of individuals.

This warning comes at a time when the Indian government is taking steps to address the issue of deepfakes. Minister for Electronics and Information Technology Ashwini Vaishnaw has announced plans to establish rules governing deepfakes, which are synthetic media that mimic authentic images, video, and audio. The government aims to formulate regulations that prevent the dissemination of deepfakes online.

The prevalence of deepfakes has become a global concern, with instances of public figures, celebrities, and ordinary individuals falling victim to manipulated content that can damage reputations and spread false information. In November, a deepfake of actor Rashmika Mandanna went viral on Instagram, prompting increased scrutiny of the technology's impact on society.

While the IT Ministry has previously issued advisories to social media platforms warning against online impersonation, the recent developments indicate a heightened focus on comprehensive measures to combat the spread of deepfake content. Narayana Murthy's cautionary message serves as a timely reminder for individuals to exercise caution and verify information in the age of rapidly advancing digital technologies.