November 2021   |   Volume 23 No. 1

The Appeal of Machine Justice

A litigation services terminal in the case acceptance division of the Shanghai No.2 Intermediate People’s Court.
Jurisdictions around the world are increasingly using artificial intelligence (AI) to help mete out justice, nowhere more so than in Mainland China. Legal scholar Dr Benjamin Chen has been studying the drivers and implications of this trend.

The Chinese judicial system has put millions of case judgments online and installed machines in courthouses that tell would-be litigants their chances of success. In one sense, this puts legal matters in the hands of ordinary citizens. But at the same time, AI-mediated justice also serves the needs of the Chinese Communist Party (CCP), says Dr Benjamin Chen of the Faculty of Law.

Dr Chen and his collaborator Dr Li Zhiyu of Durham University have explored the drivers and implications of the fervent embrace of AI and information technology in the Chinese legal system. “We’re not saying that technology is being intentionally deployed to achieve the results we describe, only that technology has grown to fill gaps and respond to tensions latent in the Chinese legal order,” he said.

They have identified three such tensions where an AI solution has appeal. The first is quite simple: technology helps relieve the heavy caseload that has arisen from government efforts to professionalise the courts by weeding out under-qualified judges. “In the past, just about anyone could be appointed a judge,” he said. “Judicial reforms have culled the ranks, but that means there are fewer judges and they are increasingly overworked. Technology is supposed to alleviate this.”

The second tension relates to strains between social harmony and the law. Law-based order was restored following the Cultural Revolution, but the insistence on legal rights and procedure sometimes worsened social conflicts rather than resolving them, so the government began promoting social harmony and encouraging mediation over adjudication. However, judges were given mediation quotas, which incentivised them to badger and bully parties into settlement.

Deterring self-help

The machines at the courthouses address the tension between social harmony and the rule of law by giving people a seemingly more objective reading of their chances of success should they litigate, based on an algorithmic analysis of previous judgments.

“The official narrative for why the machines are there is access to justice. But the technology also encourages parties to settle by bargaining in the shadow of the law. This responds to a deeper problem confronting the Chinese legal system in which the party-state wants to operate through law, but it does not want people to be too litigious and it wants to maintain social stability,” he said.
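To make the mechanism concrete, the sketch below is a purely hypothetical illustration rather than the software actually installed in Chinese courthouses; the case fields, similarity rule and names are invented for this example. It estimates a would-be litigant’s chance of success by matching the dispute against comparable published judgments and reporting the historical win rate, the kind of signal that can nudge parties towards settlement.

# Hypothetical sketch only; not the system described in the article.
# A toy estimator: the share of similar past judgments that the plaintiff won.
from dataclasses import dataclass

@dataclass
class PastJudgment:
    case_type: str        # e.g. "tenancy deposit"
    claim_amount: float   # amount claimed
    plaintiff_won: bool   # outcome recorded in the published judgment

def estimated_success_rate(past, case_type, claim_amount, tolerance=0.5):
    """Share of comparable past cases won by the plaintiff, or None if no match.

    'Comparable' here simply means the same case type and a claim amount
    within +/- tolerance (as a fraction) of the new claim.
    """
    similar = [j for j in past
               if j.case_type == case_type
               and abs(j.claim_amount - claim_amount) <= tolerance * claim_amount]
    if not similar:
        return None
    return sum(j.plaintiff_won for j in similar) / len(similar)

history = [PastJudgment("tenancy deposit", 8_000, True),
           PastJudgment("tenancy deposit", 12_000, False),
           PastJudgment("labour dispute", 30_000, True)]
print(estimated_success_rate(history, "tenancy deposit", 10_000))  # 0.5

A party shown a low estimated win rate may well choose to settle rather than sue, which is the “bargaining in the shadow of the law” described above.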

The third tension relates to party hegemony. “In very simple terms, law is very useful for deterring self-help and channelling disputes off the streets and into the courts. The CCP has done a lot to raise legal consciousness among people. But the flip side is that while the party-state encourages people to invoke their rights, it does not want them banding together. If they become organised, they represent more of a challenge to the party’s control over social discourse.

“This is a very interesting area where technology comes in. We’re not claiming it is intentional, but one consequence of making law accessible to the masses is that it removes the need to consult lawyers, NGOs, friends and family members, which might bring people together and create a kind of group consciousness.”

Big data statistics platform

The digitisation of the Chinese judicial system has made judicial documents more transparent and accessible to ordinary citizens. The Hangzhou Internet Court Real-Time Big Data Statistics Platform is one example.

Double-edged

Dr Chen and Dr Li tested the appeal of legal technology through an online survey of about 1,000 netizens in China and interviews with 100 prospective litigants who had sought legal aid. Netizens were generally aware of the digitisation of the legal system and thought it could improve the system’s legitimacy. The legal aid seekers were less aware but were enthusiastic about giving the technology a try. In both groups, however, there was still a preference for human advice.

“The use of technology in the legal system can be double-edged. On the one hand, it expands access to justice, which is a kind of democratisation. On the other hand, lawyers may be cut out of the process,” he said. That disintermediation has broader implications. “Because lawyers are agents of change in the law, the extent to which they are disintermediated could rather profoundly change how the legal system works.

“This is a concern not only for the use of legal technology in China, but more broadly, if we are going to think 50 years ahead about legal systems as a whole.”

In any case, the adoption of algorithms in the justice system will depend largely on people’s faith that they are fair. Dr Chen is also putting the finishing touches to an experimental study in the US that asks what ordinary citizens think about machine justice. It found that human judges enjoy a procedural justice advantage over AI, but when people were granted a hearing before an AI judge whose decisions were interpretable, they perceived the process to be as fair as an uninterpretable decision handed down by a human judge without a hearing.

“One note of caution about studies of lay perception is that people’s beliefs can be wrong and machine justice could actually be unfair. But people like to feel they are being heard and that is something we should take into account when imagining the future of adjudication,” he said.


Dr Benjamin Chen