Moderating:
The topic is decisions (etc.) written by AI, not by clerks. For the purpose of this thread, it doesn’t matter much whether the decision was written by the judge, or ghost written for the judge by a clerk. Let’s get back on topic, please.
Here’s a fairly detailed news article about issues that have come up in Canadian courts. One point it makes is that clients and self-reps are now doing their “research” through AI, which increases their risks and can get them hit with court costs personally.
And two recent cases from Canada. In one, the Alberta Court of Appeal ordered the lead lawyer on a file to personally pay $17,000 in costs to the other party, for costs thrown away responding to the AI hallucinations in the lawyer’s brief. (Text of the judgment is circulating, but I can’t find a link.)
The other is more serious. Not only did the lawyer file a brief containing hallucinations, it’s alleged that she then denied doing so to the judge on the case. The judge referred the matter to the Attorney General for Ontario for possible criminal contempt proceedings.