The use of artificial intelligence (AI) in the criminal justice system is expanding swiftly around the world. But Malaysian lawyers are now protesting the use of AI in their country's courts.
Malaysian lawyers contend that using an AI system in the country’s justice system is “unconstitutional,” and that no one fully understands how it works. The objections come after courts in two Malaysian states launched a pilot program that uses artificial intelligence to help judges sentence convicted drug offenders and rapists.
The AI software, created by the state government firm Sarawak Information Systems, was first introduced in 2020 in courts in Sabah and Sarawak, two Malaysian states on the island of Borneo, as part of a pilot project to test the efficacy of artificial intelligence in sentencing recommendations. The trial was scheduled to conclude in April 2022.
When a court in Sabah convicted two men of drug possession in 2020, it became the first in the nation to employ AI in recommending a sentence. However, Hadid Ismail, a lawyer with 20 years of experience who represented the defendants, disputed the sentence, asserting that the system had been deployed before judges, lawyers, and the public had a chance to fully understand it and how it worked.
“Our Criminal Procedure Code does not provide for use of AI in the courts… I think it’s unconstitutional,” Ismail told Reuters. “In sentencing, judges don’t just look at the facts of the case – they also consider mitigating factors, and use their discretion. But AI cannot use discretion,” he said, adding that the AI’s recommended punishment for one of his clients’ low-level drug offenses was too severe – 12 months in prison for possession of 0.01 grams of methamphetamine.
The Malaysian Bar Council, which represents lawyers, has also expressed its displeasure with the AI pilot program. After courts in Malaysia’s capital, Kuala Lumpur, began testing the system in mid-2021 to recommend sentences for 20 types of crimes, the council said it was “not given guidelines at all, and we had no opportunity to get feedback from criminal law practitioners.”
A Malaysian think tank, Khazanah Research Institute, published a report on the system in 2020, arguing that the mitigating measures built into the AI software, such as removing race as a variable, did not eliminate the system’s flaws. The report also noted that because the algorithm had been trained on only five years of data, from 2014 to 2019, the system was “somewhat limited in comparison with the extensive databases used in global efforts.”
A spokesperson for the Chief Justice of the Federal Court said that the use of artificial intelligence in the court system was “still in the experimental stage,” but declined to elaborate on how the system works.
In the meantime, the use of AI in criminal justice is expanding rapidly around the world, from the well-known DoNotPay – a chatbot-lawyer mobile app – to AI judges adjudicating small claims in Estonia, robot mediators in Canada, and AI judges in Chinese courts.
Proponents of these AI systems argue that they improve consistency in sentencing and can clear case backlogs faster and more cheaply, saving time and money for all parties involved in legal proceedings.
Simon Chesterman, a law professor at the National University of Singapore and senior director of AI Singapore, a government-run program, believes such systems could make the criminal justice system more efficient, but he cautions that their legitimacy depends not only on the accuracy of the decisions they produce, but also on how those decisions are made.
“Many decisions might properly be handed over to the machines. [But] a judge should not outsource discretion to an opaque algorithm,” said Chesterman.
Back in Sabah, Ismail appealed the harsh sentence the AI had recommended for his client, and the presiding judge ultimately granted his request.
Ismail warned, however, that many lawyers, especially young ones, may choose not to challenge the AI system’s recommendations, potentially subjecting their clients to excessively harsh sentences.
“The AI acts like a senior judge, young magistrates may think it’s the best decision, and accept it without question,” Ismail said.