Too trusting of ChatGPT: why does this lawyer risk being sanctioned?


Vincent Mannessier

May 30, 2023 at 5:10 p.m.


© Vitor Miranda / Shutterstock.com

ChatGPT is, for now, not capable of replacing lawyers. That may not stop the American lawyer Steven Schwartz from losing his job because of the chatbot, however.

The reason? This New York lawyer leaned a little too heavily on the AI to put together a legal brief, without checking the accuracy of what ChatGPT wrote. The result, predictable to some, is that his filing cited several supposedly similar precedents… which did not exist.

What is Schwartz accused of?

Some time ago, Roberto Mata, who claims to have been injured during a Colombia Airlines flight, contacted Levidow, Levidow, and Oberman, the law firm that employs Schwartz. When the airline asked the judge to dismiss the case, Schwartz was tasked with drafting a brief presenting the arguments for letting the trial proceed. The document, which runs to more than ten pages, is exhaustive and cites numerous similar cases that had previously gone to trial. This is where our story begins.

Because among everyone who handled the file, no one could find any trace of three of those cases. The reason is simple: they do not exist; ChatGPT invented them entirely. Confronted with this, Schwartz explained that he had only used the AI to add details to his documents and was unaware that it could invent cases. To prove his good faith, he produced a screenshot showing that he had asked the chatbot whether these cases were real, to which the AI supposedly answered yes, adding that they could be found in authentic legal databases.

© Pixabay

ChatGPT is not yet ready to be the lawyer of tomorrow

Beyond Schwartz's naivety or lack of rigor, this is far from the first time an artificial intelligence has "hallucinated", that is, asserted things as fact that turn out to be false. Even Google suffered this kind of embarrassment during a demonstration of Bard.

And this problem, which might be merely incidental if AI were to replace certain other professions, would be far more serious if it took the place of doctors, journalists… or lawyers. It also calls into question the excitement around ChatGPT's legal skills: the chatbot may be able to pass the exams required to become a lawyer, but such a flaw is unlikely to help it win over clients.

Sources: engage, Numerama
