• Air Canada has lost a court case after its chatbot gave a customer false information about a bereavement fare policy.
• The ruling found that Air Canada is responsible for all the information on its website, chatbot included.

Air Canada argued that its AI-powered customer chatbot was solely responsible for its own actions, but that argument did not hold up in civil court. Now, the airline must refund a customer who was given incorrect information about ticket compensation.


Inaccurate information

The 2022 incident involved an Air Canada passenger, Jake Moffatt, and the airline's chatbot, which Moffatt used to find out how to obtain a bereavement fare for last-minute travel to a funeral. The chatbot explained that Moffatt could retroactively claim a refund of the difference between a regular ticket and a bereavement fare, as long as he applied within 90 days of purchase. When Air Canada refused to issue the refund on the grounds that the chatbot's information was incorrect, Moffatt took the airline to court. Air Canada's arguments against the refund included the claim that it was not responsible for the chatbot's "misleading statements." Air Canada also argued that the chatbot was a "separate legal entity" that should be held accountable for its own actions, asserting that the airline was not responsible for information provided by "agents, servants or representatives – including chatbots." Whatever that means.

Responsibility is unavoidable

A member of the Canadian tribunal responded: "While a chatbot has an interactive component, it is still just a part of Air Canada's website. [...] Air Canada is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

Also read: Google’s Bard chatbot gets the Gemini Pro update globally