Playing with Fire: Deconstructing the Judicial Use of ChatGPT-4 in Pakistan

The phrases ‘ChatGPT’ and ‘artificial intelligence’ (AI) have become buzzwords in recent months, with the whole world marveling over the vast possibilities introduced by the popular AI chatbot. This excitement stems from the ease of use of the open-access program, which can create new applications and predictive reports, utilize third-party plugins, and scour the depths of the internet to provide structured, analyzed information to the user in a clear, concise format in a fraction of a second.[1] The hype has also engulfed various institutions around the world. While conversations are still underway regarding the ethics and privacy concerns[2] around the use of ChatGPT-4, on 29th March 2023 a court in Phalia (Mandi Bahauddin),[3] in pre-arrest bail proceedings under the Anti-Rape (Investigation and Trial) Act, 2021, included an ‘experimental’ use of ChatGPT-4’s responses to a series of questions within its judgment.

The case in which this supposedly innovative use of ChatGPT-4 was conducted concerned the alleged attempted rape of a minor and the accused juvenile’s consequent pre-arrest bail hearing. It goes without saying that such an unprecedented and polarizing use of ChatGPT-4 in the matter of a sexual offense with such a sensitive factual premise is unsavory, to say the least. Further, it is hierarchically unsound for a lower court to employ such an exploration, not just for the reasons mentioned below but also for the sake of respecting its traditional jurisdictional bounds. It is not for lower courts to set trends, least of all without paying heed to the potential negative impacts of their public ‘experiments’. Ideally, proper permission should be sought from the National Judicial Policy Making Committee (NJPMC) and the Law and Justice Commission of Pakistan (LJCP)[4] prior to using new approaches to judgment drafting, rather than treating their direction as an afterthought. Simultaneously, the onus is on the bodies that regulate judicial conduct to circulate updated codes of procedure regarding emerging technologies and the engagement of judicial staff with the same.

While prefacing the use of ChatGPT-4 in the judgment, it is mentioned that the experimentation is intended to explore the possibility of using such software to develop ‘crisp’ and ‘smart judicial orders’, citing the employment of automated judges in China and Dubai as aspirational models. This motive is flawed on two counts: the first being dependence on a large language model to write well-drafted judgments, and the second being the comparison to Dubai and China. Simply put, open-access large language models are not the answer to poorly drafted judgments. Staff training and secure editing software can be utilized to this end while also limiting privacy and cybersecurity risks. Furthermore, Pakistan does not have the relevant data protection mechanisms in place to counter violations of fundamental rights or constitutional freedoms via the use of AI in decision-making. Pakistan’s institutional use of AI needs to be contextually relevant, i.e. technology must complement the existing structures rather than transpose AI onto a demographic that is neither legislatively nor sociologically prepared for a tech revolution. Additionally, both China[5] and Dubai[6] have used assistive AI within their decision-making, but only for civil disputes, such as online trading, e-commerce purchase liability, copyright cases and rental disputes. There is a reason why criminal cases, let alone cases of the factual magnitude of the instant one, have been limited to traditional judgment drafting procedures. This has been done to counter the intrinsic deficiency of AI: its inability to be empathetic or conscious of nonverbal cues and the causal changes in circumstances, which a human judge may be naturally poised to construe better.

In the judgment, ChatGPT-4 was prompted with various questions regarding the grant of pre-arrest bail. Its responses notify the user of its lack of access to jurisdictionally relevant case-law and its incapability to provide legal advice. These responses go to the larger conversation of whether the Pakistani judiciary even has any use for such large language models, given that not all case-law is available in digital repositories for these programs to access in order to curate relevant responses. Even today, most judgments from as recently as a decade ago exist only on paper. These documents are usually exiled to unventilated storage rooms in judicial buildings, with no soft copies available. It is impossible to create case records for large language models or predictive writing programs unless and until efforts are undertaken to digitize all case records. The responses by the chatbot were vague and not much different from the answers yielded by a simple Google search. A few days ago, ChatGPT was banned in Italy due to its lack of transparent data processing, failure to employ age verification checks and the discrepancy between the information provided and real, verifiable data.[7] Reliance on such a platform by an institution which places its prestige in being the custodian of public trust, and which is tasked with dispensing justice, is at odds with its very function.

It is repeatedly mentioned that the experiment is intended to pioneer a way forward for evolution in judicial decision-making and judgment drafting, but even well-intentioned projects fail if they are launched without proper research and risk management. According to the judgment, the accused juvenile’s name was anonymized since the bail order was going to follow a unique route and the drafters were expecting a larger audience of interested readers. Unfortunately, the metadata of the judgment file reveals otherwise. Such blatant disregard for privacy can create life-threatening situations for those involved in cases of a sexual nature, especially when both the alleged perpetrator and the victim are minors. As mentioned earlier, such a glaring faux pas[8] makes it even more discernible that our courts are not yet prepared to harness the powers of AI and complex new smart programs when they cannot even undertake the simplest of clerical tasks. While the judgment does mention that the final order granting the accused’s petition was not influenced by the findings of ChatGPT-4, which is not reprehensible in itself, the performative nature of the experimentation seems to be inspired by the false promises of technological solutionism.

The hollow optimism around technology shows how little effort has been put into researching the ongoing debates on manipulative[9] AI and the attack on human rights, which is a resounding fear amongst legal academics and tech experts alike.[10] Lack of technological literacy can be blamed for this unnuanced approach to new technologies. There needs to be an understanding that jumping on the bandwagon and vying for relevance during trends of the Fourth Industrial Revolution is not only dangerous but can also cast doubt on institutional integrity. Recognizing this, it is high time to conduct an internal assessment of Pakistan’s judicial architecture before imported technologies are forcefully used by impressionable public service officials, with potentially disastrous outcomes.


[1] OpenAI, ‘What is ChatGPT?’ <>
[2] Catherine Thorbecke, ‘Don’t tell anything to a chatbot that you want to keep private’ <>
[3] The State v. AM, 29th March 2023 <…-allowed.-29.03.2023.pdf>
[4] Law and Justice Commission of Pakistan, <>
[5] TechGig, ‘China launches digital courts with AI judges and mobile court system’ <>
[6] Issac John, ‘Dubai to use AI for litigation without a judge’ <>
[7] Ryan Browne, ‘Italy became the first western country to ban ChatGPT. Here’s what other countries are doing’ <>
[8] United Kingdom Judiciary Guidance, ‘Practice Guidance: anonymisation and avoidance of the identification of children and the treatment of explicit descriptions of sexual abuse of children in judgements intended for the public arena’ <>
[9] KU Leuven AI summer school, ‘Open Letter: We are not ready for manipulative AI – urgent need for action’, <>
[10] Future of Life Institute, ‘Pause Giant AI Experiments: An Open Letter’ <>

The views expressed in this article are those of the author and do not necessarily represent the views of or any other organization with which she might be associated.

Aimen Taimur


The writer is an Advocate of the High Court with a specialization in human rights law from Queen Mary University of London. She has served as a Judicial Law Clerk at the Supreme Court of Pakistan from 2020-2022, during which she worked on developing protocols for the use of technology in court procedures. She is currently based in the Netherlands where she is pursuing a PhD in law and technology with her research focus being the legal regulation of artificial intelligence.


Daughter, very insightful writeup. You made me proud.
ChatGPT excited me a lot. It is indeed a leap forward in information technology that has opened up new vistas and brought convenience to writers and researchers. Too much dependency on it, however, may erode one’s own ability, thinking and research skills.
It is a learning software.
I may share my personal experience. In a recent conference on ADR held by the Legal Aid Society at Islamabad, a speaker informed the audience that Pakistan has not signed the Singapore Convention on Mediation and that efforts are being made.
However, when I checked with ChatGPT as to the status of Pakistan on the Singapore Convention, it notified me as follows:

“Yes, Pakistan has signed the Singapore Convention on Mediation. Pakistan signed the Convention on 7 August 2019 and ratified it on 1 March 2021. The Convention, also known as the United Nations Convention on International Settlement Agreements Resulting from Mediation, aims to promote the use of mediation in resolving cross-border commercial disputes”
As a courtesy, I did not challenge the speaker and privately shared the information with him. After I made my talk, I asked ChatGPT for the source of its information, and I was directed to the UN registry of conventions. I Googled and checked the UN Convention Registry; Pakistan was absent. I again reverted to ChatGPT for the source of the information, and ChatGPT again referred to the UN registry of conventions.
I checked the UN Registry once more; it was not there. I referred back to ChatGPT, and it confirmed the error and apologised for the wrong information.
Generally, correct information is given, but it is better to counter-check before passing it on to others or basing one’s writing on GPT-provided information.
Way back, I attended an SCO judges’ conference where China showcased her internet and intelligent courts system and procedure, and what interested me was “predictive judgment” based on past precedent case law. I initiated the project in the SC and carried it to my parent SHC, where I still maintain contacts with the IT team; they have prepared an AI-driven search engine to give predictive judgments, for which tons of case-law data is needed to be effective. At the SHC it is on a test run in a limited area of law. Hon’ble Justice Faisal Kamal, my successor on the IT committee, is making his best thankless efforts; he is determined to make it happen. I wish him all the success.
It is an interesting area that needs to be developed to reduce the workload on the judiciary, particularly the district judiciary, but as you rightly said, it should not be done recklessly but under certain policy guidelines.
The NJPMC has not held any meetings since I retired. I wish I had a little more time to steer the smart and intelligent court project to its logical destination.

Thank you for your kind comment. You are absolutely right sir, we are in need of serious technological integration within the judiciary but in the presence of good regulatory frameworks. I am hoping we see some good progress in this regard, with new projects building upon the foundations set in the SCP during your tenure.

Dear Aimen,
Congratulations on writing this article and discussing the Decision dated 29th March 2023 of the Judge in Phalia (Mandi Bahauddin) from several angles. I read your article “Playing with Fire: Deconstructing the Judicial Use of ChatGPT-4 in Pakistan” with great interest, and I appreciate the points you raised regarding data privacy and human rights in the use of AI in judicial decision-making.
I do not necessarily agree with the approach taken by the Judge in Phalia while writing his Decision. Yet, being a student and researcher of technology and law, I found your findings flawed at multiple levels.
Firstly, ‘AI’, including ‘ChatGPT’, is not just hype that has engulfed various institutions; rather, these are new technological realities which are being widely embraced by private and public institutions alike. The concerns you raised in this article regarding security and data privacy, and others such as bias and discrimination (not mentioned by you), are certainly valid. That is why regulatory bodies and academia around the world are engaged in discourse to determine ways to contain or minimize these risks. However, it is equally important to acknowledge that the use of AI in courts is not inherently problematic.
After reading your article, one questions why the ‘experimental’ use of ‘ChatGPT’ by a district court Judge in Phalia (Mandi Bahauddin) in Pakistan, who specifically made a disclaimer that the decision on the bail application was based on his own findings, has been so harshly criticized.
In addition, being a law student, one would like to understand which law and which traditional jurisdictional bounds restrain a lower/district court from using ChatGPT or AI, or from introducing any innovative methods in its working.
Since you are studying in the Netherlands, you must be aware that in the Netherlands, in the criminal justice system, routine cases are handled by the Public Prosecution Service, and only those cases where a judgment is required are brought before a court. On the contrary, the district courts in Pakistan are heavily burdened with pending cases, pertaining to all kinds of disputes. Additionally, the judges in Pakistan perform various administrative assignments like visiting prisons, hospitals, schools, shelter homes, etc. on the directions of Superior Courts and submit periodical reports to the higher courts in different fundamental rights cases.
A ‘lower’ court judge in Pakistan has to attend to all these assignments on his own, because foreign law graduates are not employed in the district courts for their assistance. The competence of the clerical staff of district courts is such that they are unable even to properly draft their own casual leave applications, not to mention the non-conducive environment in the district courts and the technological impediments, such as the unavailability of even network connectivity and up-to-date hardware and software. Considering all these constraints, assume that if a lower court Judge intends to write ‘crisp’ and ‘smart judicial orders’ with the use of the latest technology, his sole purpose would be to work efficiently. The whole world is trying to contain cybersecurity and data privacy concerns and is aiming to achieve the ethical use of AI. However, the pace at which technology is progressing is unprecedented. Reading your comment that ‘technology must complement the existing structures rather than transpose AI on a demographic that is neither legislatively nor sociologically prepared for a tech revolution’, one gets the impression that Pakistan shouldn’t try to explore innovation in judicial operations. I totally disagree with it.
Admittedly, ChatGPT does not have access to the legal precedents of Pakistan, and being a trained language model, the information it provides and the references it quotes are often flawed. However, if you read the Decision, the Judge has nowhere claimed that he adjudicated the bail application based on the response received from ChatGPT.
You have mentioned that ‘our courts are not yet prepared to harness the powers of AI and complex new smart programs when they cannot even undertake the simplest of clerical tasks.’ Shouldn’t this be the topic of a larger discussion: why are lower courts, which are the backbone of the judicial system of the country, so neglected that even clerical tasks cannot be performed there with precision?
While it could be agreed that there may be a lack of technological literacy and of understanding of the risks associated with the use of technology in courts, reaching the conclusion that a Judge in Phalia cannot use AI to perform his job seems an extremely narrow approach.
In conclusion, I believe that the use of AI in judicial operations is a valuable and necessary tool to improve the efficiency and effectiveness of the judicial system. While there are certainly concerns that must be addressed, we should not dismiss the potential of AI to assist judges in their work. Rather than criticizing a judge’s initiative and recommending that his or others’ conduct be regulated by the regulatory bodies, one should engage in a discourse on broader questions: why are things not changing in the district courts? How can technology be safely integrated into judicial operations to deliver justice effectively and efficiently? And why has Pakistan still not been able to adopt data privacy or any other laws to regulate technology? Rather than criticizing judges for exploring innovative approaches, we should engage in a larger discussion about how to address the challenges facing our judicial system and ensure that it remains effective and just.
