<p>A few days ago, I read an article about the disruption at Delhi’s airport caused by a malfunctioning air traffic control system. Despite reading the article twice, I couldn’t understand either the problem or the solution. The only thing I gathered is that for people like me, the technology in air navigation systems is a black box. If something goes wrong, we will neither understand how it went wrong nor know what can be done to fix it. Vast swathes of our lives are now hostage to the mysteries of technology.</p><p>There is an even more serious problem. Innocent people might be blamed for problems created by technology. Today, I discuss one such miscarriage of justice – the case is from the United Kingdom, but its lessons apply to any place that relies on technology to make its systems more efficient. The UK’s post office system consists of sub-postmasters (SPMs) managing their respective post offices under contract. The UK Post Office deployed a software system called Horizon to keep track of expenses and income in the post offices. Unfortunately, Horizon was prone to glitches, which created financial reconciliation problems in many post offices. The blame fell on the SPMs, many of whom were removed from their positions. Careers were interrupted and lives were destroyed.</p><p>The SPMs took the UK Post Office to court. In Bates v Post Office Ltd (cases decided between 2017 and 2019), the court held that the Horizon system was to blame. The SPMs were exonerated, the UK Post Office chief Paula Vennells stepped down, and royal honours bestowed on her were taken away. The government introduced legislation to ensure that the hundreds of unfairly maligned SPMs received fair compensation. 
A made-for-TV docudrama, Mr Bates vs The Post Office, was released in 2024 to much acclaim and consternation.</p><p>But the problem remains. If technology becomes too difficult to understand but too easy to accept, innocent people could be blamed. In the United States, facial recognition technology helps the police pick out potential suspects in criminal cases. Once the police start identifying people based on AI, it becomes convenient for them to arrest people on that basis alone, and difficult for the arrested to prove their innocence.</p><p>Can something be done about the perils of technology? Technology must first be understood before it is harnessed in furtherance of financial or legal system reform. Technology is useful in processing and validating facts, but when it begins making judgments in the guise of determining facts, we are in trouble. There is a difference between gathering and organising information on incarcerated individuals seeking bail and deciding whether granting bail in an individual case is the right thing to do. Technology can dominate the former inquiry but not the latter decision.</p><p>Another solution lies in the legal relationships that surround the use of technology. The Bates case consisted of multiple judgments. In one of the orders, the judge found that the UK Post Office, as a matter of contract law, did not act in good faith. It wasn’t only that the software malfunctioned; the problem was that the Post Office did not share the extent of its issues with its contractual staff. Normally, there is no requirement to act in good faith in contracts, but the judge held that, in what are termed relational contracts, parties are legally required to act in a cooperative rather than a competitive manner.</p><p>Perhaps the Bates case is an occasion for us to think about regulating technology differently. Restraining the adverse impact of technology need not be related directly to the technology itself. 
It can be controlled through other legal relationships. In the Bates case, it was the mutual contractual obligations that led to the accountability of the Post Office. In the criminal justice system, where matters such as bail hinge on the application of sophisticated technology, aspects of administrative law relating to due process and fair hearing can restrain the adverse impact of technological decision-making. For example, a personal hearing ought to be provided to citizens adversely affected by algorithmic decisions.</p><p>I think it’s less of a problem when technology makes judgment calls in our personal lives. The same technology powers both bail-processing and dating applications. Perhaps it’s up to us to choose whether we want to submit our personal lives to the vagaries of technology. But in the public sphere, individuals can’t choose to opt out of the deployment of technology. We must use technology only after a careful analysis of how it operates. Once we let the technology genie out of the bottle, our wishes may well be granted, but we might regret what we asked for.</p><p>The writer is a law professor who thinks that the law is too important to be left to the lawyers.</p><p><em>Disclaimer: The views expressed above are the author’s own. They do not necessarily reflect the views of DH.</em></p>