<p>What does it mean when state governments attempt to regulate technologies they only partially understand? What happens when policy enthusiasm outruns technological capacity? And when the state seeks to protect children from the digital world, does it risk building rules that the Internet itself will quietly bypass?</p><p>These questions are worth asking as governments begin to respond to rising concern over the influence of social media on young people. Karnataka’s Budget proposal to restrict social media access for those under 16 reflects a growing political instinct across many countries. Policymakers are increasingly uneasy about how algorithmic platforms shape childhood. Endless scrolling, compulsive engagement loops, and the pressure of digital validation have begun to redefine how young people spend their time, and how they see themselves.</p><p>In that sense, the concern from governments is entirely legitimate. The digital ecosystem surrounding children today is radically different from the one that existed even a decade ago. Platforms designed to capture attention now compete aggressively for the cognitive bandwidth of adolescents whose ability to regulate impulses is still developing. The consequences are visible in rising anxiety, shrinking attention spans, and the erosion of quiet, uninterrupted thinking.</p><p>There is also a deeper political temptation at play. Social media has become the most visible symbol of digital excess. Criticising it attracts public approval and creates the appearance of decisive intervention. Public concern does not automatically produce thoughtful regulation. Faced with complex technologies, the political instinct of the State is often to reach for the administrative equivalent of a hammer. What follows is rarely policy. 
It is often a quiet admission of limited supervisory capability.</p><p>The Internet rarely obeys political rules drawn along state borders. Before turning to the limits of enforcement, another uncomfortable truth deserves acknowledgement. Governments are not the primary custodians of childhood. Parents are.</p><p><strong>Parenting in the age of the digital nanny</strong></p><p>Over the past decade, many households have quietly allowed digital devices to fill the spaces once occupied by parental attention. Smartphones calm restless toddlers. Tablets keep children occupied during meals. Video platforms substitute for supervision when time or patience runs short. In countless homes and schools, the Internet has evolved into a convenient digital caretaker. The same generation that now applauds regulatory intervention has often been a willing participant in this quiet delegation of parenting to screens.</p><p>Children did not construct the digital environment that surrounds them. Adults did.</p><p>The risks of excessive exposure to social media are by now widely recognised. Young users inhabit platforms where comparison is constant, approval is quantified, and attention is relentlessly contested. The psychological pressure created by these environments can be significant, particularly for adolescents navigating identity and belonging. Governments, therefore, have every reason to examine how digital platforms shape childhood.</p><p><strong>Limits of bans in a borderless Internet</strong></p><p>The architecture of the Internet does not align neatly with political boundaries. A rule imposed within one jurisdiction often encounters immediate technical complications.</p><p>A State attempting to block social media access for users below a certain age would need reliable ways of identifying both the location and the identity of those users. Neither task is straightforward. 
Platforms can attempt to infer location through network signals, but such methods are frequently inaccurate. When neighbouring states follow different regulatory approaches, the digital border between them becomes even harder to define.</p><p>Age verification presents an even greater challenge. To determine whether a user is below a certain age, platforms would likely need to verify identity through official documents or other forms of authentication. That process could encourage the collection of sensitive personal data from millions of citizens. A policy designed to protect minors could inadvertently expand digital surveillance for everyone.</p><p>Even if these hurdles were overcome, behavioural reality would remain unchanged. Young people are rarely deterred by technical restrictions. Virtual private networks, incorrect age declarations or simply accessing services through another device would quickly dilute the effectiveness of any ban. Experience across the world suggests that digital prohibitions often shift behaviour rather than eliminate it.</p><p>In practice, a ban could simply drive teenagers toward less visible and less moderated corners of the Internet.</p><p><strong>Architecture of addiction</strong></p><p>There is also a deeper problem that bans fail to address. The central issue is not merely that social media exists. It is that many platforms are engineered to maximise engagement in ways that are difficult for young minds to resist.</p><p>Algorithmic feeds continuously learn which content prolongs attention. Short video formats deliver rapid bursts of stimulation that encourage repeated viewing. Notifications interrupt daily life with small prompts that draw users back into the platform. These mechanisms form a behavioural system that rewards compulsive interaction.</p><p>In other words, the addictive qualities of social media are structural features of the platforms themselves. 
If governments genuinely wish to protect young users, regulatory attention must therefore focus on platform design rather than access alone. Questions about algorithmic amplification, engagement incentives and age-appropriate digital environments deserve far greater scrutiny than blanket prohibitions.</p><p>Adolescence is a period during which individuals develop judgment and autonomy. Learning how to navigate the online world responsibly may require guided exposure rather than complete exclusion. Digital literacy, like many other life skills, often develops through experience combined with supervision.</p><p><strong>Federal authority and the future of regulation</strong></p><p>There is another dimension to this debate that deserves careful attention. The Internet functions as national and global infrastructure. In India, regulatory authority over telecommunications and digital networks largely rests with the Union government. This raises legitimate questions about whether individual states possess the jurisdiction to impose independent restrictions on digital platforms.</p><p>Such measures may ultimately face judicial scrutiny. Courts will then be asked to weigh child protection concerns against questions of constitutional authority and technological feasibility.</p><p>Equally important is the process through which digital regulation emerges. Policies that affect millions of citizens should be shaped through consultation with technologists, educators, civil society and industry. Quick announcements may generate political momentum but often leave administrators confronting practical challenges that were never fully considered.</p><p>None of this diminishes the seriousness of the underlying concern. Childhood is being reshaped by digital systems that were never designed with the psychological needs of young users in mind. 
Governments are right to recognise that problem and to search for solutions.</p><p>But meaningful regulation in the digital age requires precision, technical literacy and institutional patience.</p><p>The real challenge before policymakers is not to construct prohibitions that the Internet will quietly circumvent. It is to reshape the incentives and architectures that currently reward addictive behaviour and convert human attention into an endlessly exploitable commodity.</p><p>Until policy reaches that deeper layer of the digital ecosystem, bans will remain what they often become in complex technological environments.</p><p>Declarations of intent that the world soon learns to ignore.</p><p><em>Srinath Sridharan is a corporate adviser and independent director on corporate boards. X: @ssmumbai.</em></p><p><em>Disclaimer: The views expressed above are the author's own. They do not necessarily reflect the views of DH.</em></p>