This airline’s AI hallucinated and “ripped off” a customer, and the company must pay

An airline’s AI misadvised a customer with a hallucinated response. Sued in court, the company argued that the chatbot was an entity in its own right and therefore responsible for its own actions. It lost.

Air Canada
Credits: 123RF

In 2022, Jake Moffatt had to book a plane ticket with Air Canada to travel to the funeral of a family member. As is often suggested in this type of situation, he inquired about the conditions for obtaining the bereavement fare. He put his question to the artificial intelligence chatbot on the Air Canada website. The chatbot told him he had to submit a form “within 90 days following the date of issue of the ticket”. He would then receive the difference between the full fare and the reduced fare.

Armed with this information, Moffatt paid full price for his ticket and submitted his request after his return. The airline refused, saying it was too late since the trip had already been made, and directed the customer to the page of its site dedicated to the bereavement fare. When the man produced a screenshot of his conversation with the AI, Air Canada acknowledged that the chatbot had used “misleading words” in its response and promised to update it. In other words, the artificial intelligence had hallucinated the conditions for obtaining the bereavement fare.

The airline wanted the AI to be held responsible for its hallucination

The phenomenon is not new; it even led to the sanctioning of two lawyers in 2023. Still, Jake Moffatt took Air Canada to court. The company mounted a defense that was original to say the least: it asserted that the AI was a “separate legal entity” and therefore responsible for its own actions. The argument was swept aside by tribunal member Christopher Rivers: “Even though a chatbot has an interactive component, it is only part of the Air Canada website. It should be obvious to Air Canada that it is responsible for all information contained on its website. It does not matter whether that information comes from a static page or a chatbot.”

Read also – These AIs simulate wars and hallucinate about Star Wars and The Matrix

Same story when the company pointed out that information on the bereavement fare was available on its site. “There is no reason for Mr. Moffatt to know that one section of the Air Canada web page is accurate and another is not,” Rivers responded. The firm was ordered to pay $650.88, the difference between the full-fare ticket price and the bereavement fare, plus fees amounting to $161.14 in total.

Source: The Guardian
