Lawyers outraged over use of AI in courts

13 Apr, 2022 16:16
Using artificial intelligence in court cases is “unconstitutional,” say Malaysian lawyers

Malaysian lawyers say the use of an AI system in the country’s justice system is “unconstitutional” and claim that no one really understands how it works. That’s after courts in two Malaysian states launched a test program to use AI to assist judges in delivering sentences for convicted drug dealers and rapists.

The AI software, developed by state government firm Sarawak Information Systems, was first introduced in 2020 to two courts in Sabah and Sarawak on the island of Borneo as part of a pilot scheme to examine the efficiency of artificial intelligence in sentencing recommendations. The test was set to end in April 2022.

The court in Sabah was the first in the country to use AI to help deliver a court sentence when it convicted two men for drug possession in 2020. However, Hadid Ismail – a lawyer with 20 years of experience who represented the defendants – took issue with the sentence, claiming that the system was being put to use before judges, lawyers and the public had a chance to fully understand how it worked.

“Our Criminal Procedure Code does not provide for use of AI in the courts... I think it’s unconstitutional,” Ismail told Reuters. “In sentencing, judges don’t just look at the facts of the case – they also consider mitigating factors, and use their discretion. But AI cannot use discretion,” he said, adding that the sentence given by the AI to one of his clients for minor drug possession was too harsh – 12 months in jail for possession of 0.01 gram of methamphetamine.

Malaysia’s Bar Council, which represents lawyers, has also voiced its frustration with the AI pilot program. After courts in Kuala Lumpur, Malaysia’s capital, started testing the system in mid-2021 to suggest sentences for 20 types of crimes, the council said it was “not given guidelines at all, and we had no opportunity to get feedback from criminal law practitioners.”

A Malaysian think tank, the Khazanah Research Institute, published a report on the system in 2020, arguing that the mitigating measures built into the AI software, such as removing race as a variable, did not make the system perfect. It also noted that the system was “somewhat limited in comparison with the extensive databases used in global efforts”, as the algorithm was trained on only five years of data, from 2014 to 2019.

A spokesperson for the Federal Court Chief Justice stated that the use of artificial intelligence in courts was “still in the experimental stage” but declined to comment any further on the operation of the system.

Meanwhile, the use of AI in the criminal justice system has been growing rapidly throughout the world, from the popular DoNotPay – a chatbot lawyer mobile app – to AI judges adjudicating on small claims in Estonia, robot mediators in Canada, and even AI judges in Chinese courts.

Proponents of these AI systems insist they make sentencing more consistent and can clear case backlogs quickly and cheaply while saving time and money for parties involved in the legal proceedings.

Simon Chesterman, a professor of law at the National University of Singapore and senior director at AI Singapore – a government-run program – says the technology has the potential to improve efficiency in the criminal justice system, but acknowledges that the legitimacy of such systems depends not only on the accuracy of the decisions made, but also on the manner in which they are made.

“Many decisions might properly be handed over to the machines. [But] a judge should not outsource discretion to an opaque algorithm,” said Chesterman.

Back in Sabah, Ismail appealed against his client’s harsh AI-recommended sentence, and the judge presiding over the case eventually granted the appeal.

However, Ismail has warned that many other lawyers, particularly young ones, may decide not to mount a challenge to the AI system – potentially condemning their clients to unduly harsh sentences.

“The AI acts like a senior judge, young magistrates may think it’s the best decision, and accept it without question,” Ismail said.