Air Canada found liable for chatbot's bad advice on plane tickets

A screengrab from Air Canada's customer information page. The B.C. Civil Resolution Tribunal found the airline liable for bad advice offered by a chatbot on the company's website that meant a passenger couldn't claim a bereavement rate. (Air Canada - image credit)

Air Canada has been ordered to pay compensation to a grieving grandchild who claimed they were misled into purchasing full-price flight tickets by an ill-informed chatbot.

In an argument that appeared to flabbergast a small claims adjudicator in British Columbia, the airline attempted to distance itself from its own chatbot's bad advice by claiming the online tool was "a separate legal entity that is responsible for its own actions."

"This is a remarkable submission," Civil Resolution Tribunal (CRT) member Christopher Rivers wrote.

"While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."

'Misleading words'

In a decision released this week, Rivers ordered Air Canada to pay Jake Moffatt $812 to cover the difference between the airline's bereavement rates and the $1,630.36 they paid for full-price tickets to and from Toronto, bought after their grandmother died.

Moffatt's grandmother died on Remembrance Day 2022. Moffatt visited Air Canada's website the same day.


Jake Moffatt claimed they bought full-fare tickets to Toronto and back based on a chatbot's advice that they could retroactively make a bereavement claim. (CBC / Radio-Canada)

"While using Air Canada's website, they interacted with a support chatbot," the decision says.

Moffatt provided the CRT with a screenshot of the chatbot's words: "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form."

Based on that assurance, Moffatt claimed they booked full-fare tickets to and from Toronto.

But when they contacted Air Canada to get their money back, they were told bereavement rates don't apply to completed travel — something explained on a different part of the airline's website.

Moffatt sent a copy of the screenshot to Air Canada — pointing out the chatbot's advice to the contrary.

"An Air Canada representative responded and admitted the chatbot had provided 'misleading words,'" Rivers wrote.

"The representative pointed out the chatbot's link to the bereavement travel webpage and said Air Canada had noted the issue so it could update the chatbot."

Apparently, Moffatt found that cold comfort — and opted to sue instead.

'Reasonable care' not taken to ensure accuracy: CRT

According to the decision, Air Canada argued that it can't be held liable for information provided by one of its "agents, servants or representatives — including a chatbot."

But Rivers noted that the airline "does not explain why it believes that is the case."

Air Canada planes sit on the tarmac at Pearson International Airport during the COVID-19 pandemic in Toronto on Wednesday, April 28, 2021.

Air Canada claimed it could not be held liable for information provided by its chatbot. But the Civil Resolution Tribunal disagreed. (Nathan Denette/The Canadian Press)

"I find Air Canada did not take reasonable care to ensure its chatbot was accurate," Rivers concluded.

Air Canada argued Moffatt could have found the correct information about bereavement rates on another part of the airline's website.

But as Rivers pointed out, "it does not explain why the webpage titled 'Bereavement Travel' was inherently more trustworthy than its chatbot."

"There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not," Rivers wrote.

A search of the Canadian Legal Information Institute — which maintains a database of Canadian legal decisions — shows a paucity of cases featuring bad advice from chatbots; Moffatt's appears to be the first.