<p>On March 25, a Los Angeles jury handed down a significant verdict in favour of a young woman who sued Meta and YouTube. The jury found that the companies were negligent in the design of their platforms, knew their design was dangerous, failed to warn of those risks, and caused substantial harm to the plaintiff.</p><p>The verdict may mark an inflection point in design choices by technology platforms. For years, concerns around social media have focused on outcomes. Excessive screen time. Anxiety, especially among younger users. The sense of being unable to disengage. What this judgment does differently is shift the focus from outcomes to design.</p><p>The argument accepted by the jury is that aspects of the platforms were designed in ways that made such harm more likely. It moves the conversation from user behaviour to system design, and from individual responsibility to shared responsibility. Design choices are no longer being seen as neutral.</p><p><strong>How platforms are designed to keep us hooked</strong></p><p>None of the individual features used by platforms is unusual on its own. In fact, many feel intuitive, even helpful. Endless scrolling removes the need to click through pages; you never reach the end of a feed. Videos play automatically, so the experience feels seamless. Notifications bring you back to content you might otherwise miss. Algorithms tailor feeds to what holds your attention. With all this, disengaging takes conscious effort.</p><p>Each of these decisions can be justified independently. But taken together, they create a system with no natural stopping point. It is a system designed to sustain attention.</p><p><strong>When outcomes are designed into the system</strong></p><p>We tend to explain outcomes in terms of individual choices. People spend too much time online, get distracted, and lack discipline. That framing is not entirely wrong, but it is incomplete. 
Behaviour does not occur in isolation. It is shaped by the environment in which choices are made. And that environment, in many cases, is deliberately designed.</p><p>This is not unique to technology. In organisations, incentive structures influence behaviour more powerfully than stated values. In financial services, product design and sales targets can drive outcomes that no one explicitly intended but that are entirely predictable in hindsight. Leaders are often surprised by outcomes that are, in fact, embedded in the system they have built.</p><p>Social media platforms make this dynamic more visible because the scale is so large and the feedback loops so immediate.</p><p><strong>The quiet role of metrics</strong></p><p>Behind every design choice sits a metric. Platforms optimise for time spent, engagement, and frequency of return. These are not unreasonable objectives for businesses built on advertising or content distribution. But metrics are not neutral. They shape decisions. If time spent is the goal, removing stopping cues becomes logical. If engagement is the goal, emotionally charged content will tend to surface more often. If frequency of return matters, notifications will be calibrated to draw users back. Over time, these choices compound.</p><p>What begins as optimisation becomes a system that nudges behaviour in one direction. Far less often do we examine how metrics shape the experience of those outside the organisation.</p><p><strong>Where should responsibility lie?</strong></p><p>It would be easy to take a binary position. Either users are responsible for their own behaviour, or platforms are responsible for shaping that behaviour. The reality, however, is more complex.</p><p>Individuals have agency. People make choices about how they spend their time, and users are not passive. Awareness of platform dynamics has grown considerably. Screen time controls, opt-out options, and content filters are now standard features on most devices. 
Nor can we dismiss the fact that many people have become more deliberate about how they engage.</p><p>At the same time, people are not making these choices in a neutral environment. The system is designed to influence behaviour. The more deliberate that design, the less we can attribute outcomes only to individual decisions. Both product design and individual choices shape outcomes, and the recent judgment signals that the balance between them is being reconsidered.</p><p><strong>The need for guardrails</strong></p><p>It would be a mistake to see this as a debate limited to social media. Any system that operates at scale and shapes human behaviour raises similar questions. How do compensation structures influence risk-taking? How do public policies shape people’s behaviour, often in unintended ways? The answer has never been to abandon the system, but to build guardrails into it.</p><p>That is the work that remains to be done with technology. The benefits are real and, in many contexts — education, healthcare, financial inclusion — potentially transformative. But the rush to capture those benefits has often meant accepting the design choices of platforms largely on their terms. The more deeply technology is embedded in how people live and work, the more consequential its design choices become — and the less we can afford to treat them as someone else's problem.</p><p><em>(Roopa Kudva headed Crisil and ONI.)</em></p><p><em>Disclaimer: The views expressed above are the author's own. They do not necessarily reflect the views of DH.</em></p>