<p>When Babydoll Archi appeared online – pouting in a scarlet sari and a pink blouse – she captivated 1.4 million overnight. But Babydoll Archi was not real. She was a deepfake – a sexually explicit persona engineered from a single photograph of a real woman by an embittered ex-lover, who targeted her with revenge porn and unleashed the grotesque fantasies of millions. Deepfakes don’t flourish simply because the technology exists; their strongest driver is a cultural context in which women’s safety and dignity are undervalued.</p><p>In India, where about 31,000 rapes are reported annually and one in three married women faces spousal violence, deepfakes find fertile ground. In such contexts, intimacy is often entangled with domination, and fantasy is mistaken for desire, shaping how women come to be coveted. Deepfakes that digitally graft women’s faces onto nude bodies are a form of digital patriarchy. They reduce women from humans to consumable images, from subjects of affection to objects of conquest. They expose the perilous slide into degradation, amplifying the violence women and children live with daily.</p><p>The crux of the matter is not whether pornography harms women – it does. The question is how we navigate the treacherous waters where the lines between desire and violence blur. As AI churns out content at breakneck speed, the anonymous appetite for new women is projected to grow by 35% over the next five years, outpacing efforts to regulate it; 98% of its targets will still be women.</p><p>This does not mean that all pornography is toxic. When consensual and respectful, it can foster healthy sexual expression. Some argue that greater exposure to deepfake pornography will breed scepticism, since 41-54% of 13-20-year-olds can discern deepfakes.
But the reality is that many fall prey to its deception, mistaking manipulated images for authentic ones, with dire consequences for young minds forming their perceptions of relationships and consent. Even those who recognise deepfakes as fake may weaponise them by sharing, ridiculing, or engaging in online mobbing, compounding the harm.</p><p>The science is clear: violent pornography fosters toxic beliefs, heightens the risk of sexual violence, and normalises hostility towards women. When pornography aggressive towards women is consumed in high-misogyny contexts, it catalyses their debasement and makes it harder for women to escape abusive relationships. Violent fantasies, especially those fuelled by misogyny, are potent, predictive risk factors for real-world sexual aggression.</p><p>As a psychologist, I have treated women affected by molestation, rape, and incest, who carry scars of shame, panic, and the corrosive belief that their personhood can be rewritten by others. Survivors describe deepfakes as re-traumatisation, an echo of earlier non-consensual experiences in which wanting is, in fact, taking. For women, this dynamic does more than endanger them physically – it can trap them in cycles of abuse and vulnerability.</p><p>While other countries are advancing legal remedies, India’s failure is rooted not in technology but in history. The US Supreme Court upheld age-verification laws for pornographic websites, and the UK’s Online Safety Act mandates age checks on adult content. India’s lag lies in the weight of history borne by its women. Colonial and caste histories normalised control over women’s bodies, granting upper-caste men unchecked access to lower-caste women.
AI industrialises these abuses: the doxxing and mock auctions of Muslim women, as in Sulli Deals, reproduce the dynamics of domination and fear on a global and permanent scale.</p><p>Legislative measures alone will not be enough; a profound cultural reckoning is necessary to dismantle the dehumanisation of women. It can prompt questions about our complicity, such as: What does it mean to consume pornography in a world where women’s identities can be maliciously reconstructed and sold as fantasies?</p><p>The Information Technology Act, the Bharatiya Nyaya Sanhita, and the Digital Personal Data Protection Act provide limited remedies, but none criminalises deepfake nudes. Future remedies may include verified access within a traceable ecosystem; incentives for platforms to remove non-consensual images swiftly; fast-track cybercrime courts with dedicated benches to conclude trials within six months, minimising the trauma of prolonged litigation; and accountability measures for online platforms.</p><p>It is not enough to hold the creators accountable; we must also ask soul-searching questions of ourselves: What does it mean to be a consumer of porn in an age when women’s faces are stitched into fantasies without consent? As it did to the real Archi, deepfake pornography destroys a woman’s confidence and mutates the sexual act into one of her debasement.</p><p>Deepfake porn collapses the line between looking and doing harm. When anonymity becomes an engine of violence, it dissolves women’s humanity into performance, collapses intimacy into spectacle, and reimagines women as consumables. In this sense, to watch is not only to witness – it is to wound.</p><p><em>The writer is an international psychologist, former professor, and writer on culture, cosmopolitanism, and global affairs.</em></p>