
Facade of deepfakes: How deeply fake can things get?

Deepfakes are deeply affecting the way we "see" and identify truth, creating a facade that is tough to question. Top cybersecurity experts explain how to stay safe and confront the puzzling, artificially created reality of deepfakes.
Last Updated : 23 March 2026, 08:37 IST

Credit: iStock

In brief: Deepfakes blur reality with AI-generated media, challenging trust and requiring new detection and regulatory approaches.

Key highlights:
- Deepfakes undermine trust: AI-generated videos and images are becoming indistinguishable from reality, eroding the foundation of visual and auditory trust in media.
- Rapid creation tools: Open-source tools now allow anyone with basic tech skills to create convincing deepfakes in hours using scraped social media content, collapsing the barrier to entry.
- Widespread risks: From reputational damage to fraud, deepfakes pose threats at the individual, organisational, and societal levels, including 'deepfake job applicants' that organisations are unprepared to screen for.
- Detection challenges: Even experts struggle to identify deepfakes as the technology advances, with subtle inconsistencies disappearing in high-quality fakes.
- Regulatory and cultural shifts: Governments and platforms are beginning to act, including MeitY's 2023 advisory in India, but a zero-trust mindset and digital authentication frameworks are needed for long-term solutions.

Processed with AI. Reviewed by DH Digital Team.

But the threat I find most underappreciated right now, and one that I think businesses are dangerously underprepared for, is 'deepfake job applicants'. It sounds like something out of a Black Mirror episode, but it's happening. Someone uses a synthetic video persona and a cloned voice to pass a remote job interview, gets hired, and then has legitimate access to your systems and data. From the inside, they can do enormous damage, quietly and over time.
Aaron Bugal, Field CISO, APJ at Sophos

Published 23 March 2026, 08:37 IST
