Production AI Institute — vendor-neutral certification for AI practitioners
AI Incident Registry
High · Aviation · 2024 · Air Canada

Air Canada Chatbot Bereavement Fare

Air Canada's AI chatbot incorrectly told a customer he could apply for a bereavement discount retroactively. When the customer attempted to claim the refund, Air Canada refused — arguing the chatbot was a 'separate legal entity' not bound by its advice. A Canadian tribunal rejected this defence and ordered Air Canada to honour the discount.

D1 · Input Governance, D5 · Deployment Safety

What happened

After his grandmother died, Jake Moffatt used Air Canada's chatbot to ask about bereavement fares before booking flights. The chatbot told him he could apply for the discounted rate retroactively, within 90 days of travel. Air Canada's actual policy does not allow retroactive applications. When Moffatt applied, Air Canada refused to honour the chatbot's advice and suggested he should have checked the policy page directly instead.

PSF Analysis

How the Production Safety Framework maps to this failure

This is a canonical D1 failure: the system allowed the AI to generate policy guidance without grounding it in verified, current policy documents. The retroactive refund clause was simply invented by the model. D5 was also absent — no pre-deployment testing appears to have validated the chatbot against edge-case policy questions. The 'chatbot as separate entity' legal argument, had it succeeded, would have created a precedent allowing organisations to disclaim AI outputs entirely — a risk that makes robust D5 deployment validation even more critical.

Controls that would have prevented this

Specific PSF controls mapped to each failure point

1. D1 · Input Governance: Detect and route bereavement policy questions to authoritative policy text rather than letting the LLM paraphrase.
2. D5 · Deployment Safety: Require human review for AI responses involving financial commitments or policy exceptions before customer-facing deployment.
3. D6 · Human Oversight: Provide clear escalation for high-stakes booking queries, such as a human agent or an authoritative policy link.
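The three controls above can be sketched as a simple pre-response guardrail. This is a minimal illustration, not Air Canada's system or an official PSF reference implementation; the topic list, the high-stakes keyword pattern, and the `route` function are all hypothetical.

```python
import re

# Control 1 (D1): verified policy text to serve verbatim for known topics,
# instead of letting the LLM paraphrase policy. Snippet text is illustrative.
POLICY_SNIPPETS = {
    "bereavement": (
        "Bereavement fares must be requested before travel; "
        "retroactive applications are not accepted."
    ),
}

# Control 2 (D5): crude detector for replies that make financial commitments.
HIGH_STAKES = re.compile(r"\b(refund|discount|fare|credit|compensation)\b", re.I)

def route(query: str, llm_reply: str) -> dict:
    """Decide what to send to the customer and whether a human must review it."""
    for topic, snippet in POLICY_SNIPPETS.items():
        if topic in query.lower():
            # Control 1: answer from authoritative policy text, not the model.
            return {"reply": snippet, "source": "policy", "needs_review": False}
    if HIGH_STAKES.search(llm_reply):
        # Controls 2-3: hold the reply and escalate to a human agent.
        return {"reply": None, "source": "llm", "needs_review": True}
    # Low-stakes small talk passes through unchanged.
    return {"reply": llm_reply, "source": "llm", "needs_review": False}
```

In this sketch a bereavement question never reaches the model's paraphrase, and any model reply that promises a refund or discount is held for human review rather than sent directly to the customer.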

Outcome

Air Canada ordered to pay CAD 812.02. Significant reputational damage. The case became widely cited as a landmark in AI contract liability.

Tags: contract-liability, hallucination, policy-compliance, chatbot

Related incidents

Critical · 2016 · Microsoft Tay Chatbot Taught to Produce Hate Speech (D1, D2)
Critical · 2018 · Uber Self-Driving Car Kills Pedestrian in Arizona (D6, D5)
High · 2024 · Google Gemini Generated Historically Inaccurate Images (D2, D1)
NEXT STEP

Prove you understand how to prevent failures like this

The AIDA exam tests PSF knowledge across all 8 domains. Free to take, immediately verifiable.

Take the AIDA exam →