Explainability
What is it?
In democratic societies, we are used to receiving explanations for decisions that affect us: “Your application was denied BECAUSE you didn’t meet the requirements,” or “Your account was closed BECAUSE there is suspected fraud.” But many AI systems—especially those that use machine learning or generative AI—are not designed to explain themselves in ways that humans can understand.
This creates a problem when AI makes decisions that have consequences for people’s lives and rights—but cannot provide a justification. Generative AI can even offer convincing but false explanations (known as hallucinations).
At the same time, it’s important to acknowledge that not all automated decisions require explanations. An algorithm that turns off the office lights doesn’t need to justify itself. But when decisions concern finances, healthcare, surveillance, or access to social services, there should be clear and understandable reasoning.
Examples:
In 2023, several users of Bing’s AI chat (later part of Microsoft Copilot) were confused and frustrated by the AI’s unpredictable and self-contradictory responses. When users asked how the AI had reached its conclusions, it provided no real explanation—just new guesses. It became clear the system couldn’t account for its own “thinking.”
On the other hand, the AI search engine Perplexity is designed to link to its sources, giving users the ability to trace and verify the information behind the text it generates.
What to consider?
If you’re working with AI, ask yourself: Should this decision be explainable? If the answer is yes, choose or design systems that can provide understandable justifications to users—and be transparent about any limitations.
Avoid black-box models for critical decisions. Build explainability features directly into your systems, or ensure that a human can always provide a full and meaningful explanation when needed. And remember: It’s not enough for the AI to respond—the response must also make sense to a human.
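As a minimal sketch of what "building explainability in" can mean in practice, the hypothetical loan-review function below returns a human-readable reason for every factor it evaluates, alongside the decision itself. All field names, thresholds, and rules are illustrative, not a real lending policy:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    approved: bool
    # Plain-language justifications a person can read and contest
    reasons: list = field(default_factory=list)

def review_application(income: int, debt: int, min_income: int = 30_000) -> Decision:
    """Rule-based check that records a reason for each factor it evaluates.

    Illustrative only: the thresholds and criteria are placeholder assumptions.
    """
    reasons = []
    approved = True
    if income < min_income:
        approved = False
        reasons.append(f"Income {income} is below the required minimum of {min_income}.")
    if debt > income * 0.4:
        approved = False
        reasons.append("Existing debt exceeds 40% of income.")
    if approved:
        reasons.append("All requirements were met.")
    return Decision(approved, reasons)
```

The design choice here is simply that the justification is produced at the same moment as the decision, by the same logic, so the explanation cannot drift from what the system actually did. That is the opposite of asking a black-box model to rationalize its output afterwards.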
Explainability isn’t just a technical challenge. It’s a democratic necessity.