British Columbia’s Civil Resolution Tribunal has issued a significant ruling for airline accountability in the era of AI-powered customer service, finding that Air Canada is bound by a refund policy invented by its own chatbot. Jake Moffatt filed a claim for a bereavement fare refund following the death of his grandmother, after Air Canada’s chatbot misled him into believing he could request the refund even after his travel was complete.
A Misleading Chatbot and a Frustrated Passenger:
Jake Moffatt visited the Air Canada website the day his grandmother passed away to arrange a flight from Vancouver to Toronto. Unsure how the airline’s bereavement fare policy worked, he asked its chatbot. Rather than directing him to the official policy, the chatbot mistakenly told him he could book the flight right away and request a refund within 90 days. Moffatt purchased the ticket on the strength of that information, but when he later applied for the refund, Air Canada denied his request.
Air Canada’s Refusal and the Legal Battle:
Air Canada acknowledged that the chatbot was wrong but refused to issue the refund, arguing that because the chatbot was not a human agent, the airline could not be held accountable for the false information it provided. Air Canada also asserted that its official bereavement fare policy, which expressly prohibits refunds for completed trips, ought to take priority. Moffatt persisted, believing he had been misled and treated unfairly, and brought his case to the Civil Resolution Tribunal.
In a landmark ruling, the Tribunal found in favor of Moffatt. According to tribunal member Christopher Rivers, Air Canada “failed to take reasonable care to ensure their representations were not misleading” and had a “duty to be accurate.” He emphasized that because the chatbot was made available on Air Canada’s official website, which presented it as a reliable source of information, the airline was responsible for the chatbot’s content and accuracy.
A Case Study in AI-Powered Customer Support:
This decision marks a significant development for AI-powered customer support. It makes clear that, even though chatbots are not human agents, businesses are still obligated to ensure the accuracy of the information they provide. As companies rely more and more on AI to engage with customers, the Moffatt case sets a standard for holding them responsible for false information and its consequences.
The decision also underscores the potential drawbacks of relying on chatbots alone to answer complicated questions. While chatbots offer convenience and can be helpful for routine queries, sensitive situations and complex policy areas may still require human interaction. The Air Canada case is a reminder that chatbots should be deployed responsibly, with clear disclaimers and easy access to human support when circumstances are complicated.
The consequences of this decision extend beyond Air Canada. Airlines and other businesses that use chatbots should take note and prioritize accuracy, transparency, and ethical implementation. The ruling also empowers customers: it is a useful reminder to be wary of information from chatbots and to seek human confirmation before making important decisions, especially in private or financial matters.