Air Canada must adhere to a refund policy invented by their chatbot

After months of resistance, Air Canada was forced to give a partial refund to a grieving passenger who was misled by the airline's chatbot when it inaccurately explained the airline's bereavement travel policy.

On the day Jake Moffatt's grandmother died, Moffatt immediately went to Air Canada's website to book a flight from Vancouver to Toronto. Unsure how Air Canada's bereavement rates worked, Moffatt asked Air Canada's chatbot for an explanation.

The chatbot provided inaccurate information and encouraged Moffatt to book a flight immediately and then request a refund within 90 days. In fact, Air Canada's policy specifically states that the airline will not provide refunds for bereavement travel once the flight has been booked. Moffatt dutifully tried to follow the chatbot's advice and request a refund, but was shocked to have the request rejected.

Moffatt tried for months to convince Air Canada that a refund was owed, sharing a screenshot of the chatbot's response, which clearly stated:

If you need to travel immediately or have already traveled and would like to submit your ticket at a discounted bereavement rate, please do so within 90 days of the issue date of your ticket by completing our ticket refund request form.

Air Canada argued that Moffatt should have known that bereavement rates could not be requested retroactively because the chatbot's response elsewhere linked to a page with the actual bereavement travel policy. Rather than a refund, the best Air Canada would offer was a promise to update the chatbot and a $200 voucher for a future flight.

Unsatisfied with this response, Moffatt rejected the voucher and filed a small claim with Canada's Civil Resolution Tribunal.

According to Air Canada, Moffatt should never have trusted the chatbot, and the airline should not be held liable for the chatbot's misleading information because, the tribunal's decision said, Air Canada essentially argued that "the chatbot is a separate legal entity that is responsible for its own actions."

Experts told the Vancouver Sun that Moffatt's case appears to be the first time a Canadian company has argued that it is not liable for information provided by its chatbot.

Tribunal member Christopher Rivers, who decided the case in Moffatt's favor, called Air Canada's defense “remarkable.”

"Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot," Rivers wrote. "It does not explain why it believes that is the case" or "why the webpage titled 'Bereavement Travel' was inherently more trustworthy than its chatbot."

Additionally, Rivers found that Moffatt had “no reason” to believe that one part of Air Canada's website was accurate and another was not.

Air Canada “does not explain why customers need to double-check information they find in one part of its website in another part of its website,” Rivers wrote.

In the end, Rivers ruled that Moffatt was entitled to a partial refund of CA$650.88 (approximately $482) off the original fare of CA$1,640.36 (approximately $1,216), as well as additional damages to cover interest on the airfare and Moffatt's tribunal fees.

Air Canada told Ars it will comply with the ruling and considers the matter closed.

Air Canada's chatbot appears to be disabled

When Ars visited Air Canada's website on Friday, there appeared to be no chatbot support available, suggesting that Air Canada has disabled the chatbot.

Air Canada did not respond to Ars' request to confirm whether the chatbot remains part of the airline's online support offerings.