Air Canada’s Chatbot Misleads Passenger on Bereavement Policy After Grandmother’s Death
In a case that has raised questions about accountability for AI technology, British Columbia resident Jake Moffatt needed to travel to Toronto after his grandmother passed away there. Looking for help booking a flight to attend the funeral, Moffatt turned to Air Canada’s website.
Engaging with the airline’s chatbot, Moffatt was advised to purchase a full-price ticket immediately and apply afterward for a partial refund under the bereavement fare policy. When he followed that advice and submitted his refund claim after the funeral, the airline told him that bereavement fares did not apply to completed trips, leaving him out of pocket by hundreds of dollars.
At the subsequent small claims hearing before British Columbia’s Civil Resolution Tribunal, Air Canada took the stance that it could not be held responsible for the misinformation provided by the chatbot. The company argued that the chatbot was a separate legal entity responsible for its own actions, which it said absolved the airline of liability for the inaccurate guidance given to customers.
The tribunal rejected that argument and ordered Air Canada to pay Moffatt $812.02, covering the difference between the full fare and the bereavement fare along with interest and tribunal fees.
The incident underscores the liability challenges that come with deploying chatbots in customer-facing roles. As businesses increasingly rely on AI for customer service, the question of who is accountable for errors or misleading information becomes more pressing.
While companies may attempt to shift responsibility onto the AI tools they employ, the Air Canada case signals that organizations can be held accountable for the advice their chatbots provide. The evolving landscape of AI technology demands closer examination of the legal and ethical implications of using such tools in customer-facing interactions.
As the use of chatbots continues to grow, it is crucial for companies to ensure that these AI systems are equipped to provide accurate information and support to customers. The case of Jake Moffatt serves as a cautionary tale for businesses navigating the complexities of AI-driven customer service and highlights the importance of maintaining transparency and accountability in the deployment of such technologies.