Identifying the "Right" Level of Explanation in a Given Situation - Equipe Image, Modélisation, Analyse, GEométrie, Synthèse
Conference paper. Year: 2020


French title: Une méthode pour identifier le niveau optimal d'explication de l'IA dans une situation donnée (A method for identifying the optimal level of AI explanation in a given situation)

Abstract

We present a framework for defining the "right" level of AI explainability based on technical, legal, and economic considerations. Our approach involves three logical steps. First, define the main contextual factors, such as the audience of the explanation, the operational context, the level of harm that the system could cause, and the legal/regulatory framework. This step helps characterize the operational and legal needs for explanation, and the corresponding social benefits. Second, examine the technical tools available, including post-hoc approaches (e.g., input perturbation, saliency maps) and hybrid AI approaches. Third, as a function of the first two steps, choose the right levels of global and local explanation outputs, taking into account the costs involved. We identify seven kinds of costs and emphasize that explanations are socially useful only when total social benefits exceed costs.
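As a minimal illustration of the post-hoc, perturbation-based local explanations mentioned in the second step, the sketch below estimates per-feature importance by perturbing one input feature at a time and measuring the average change in a black-box model's output. The model, feature values, and function names here are illustrative assumptions, not taken from the paper:

```python
import random

def model(x):
    # Toy stand-in for a black-box model's score (illustrative only).
    return 0.8 * x[0] - 0.3 * x[1] + 0.05 * x[2]

def perturbation_importance(model, x, noise=1.0, trials=100, seed=0):
    """Estimate local feature importance at input x by adding Gaussian
    noise to each feature in turn and averaging the absolute change
    in the model's output."""
    rng = random.Random(seed)
    base = model(x)
    scores = []
    for i in range(len(x)):
        total = 0.0
        for _ in range(trials):
            xp = list(x)
            xp[i] += rng.gauss(0.0, noise)
            total += abs(model(xp) - base)
        scores.append(total / trials)
    return scores

scores = perturbation_importance(model, [1.0, 2.0, 3.0])
# For this linear toy model, features with larger absolute coefficients
# receive larger importance scores.
```

Such a perturbation-based score is a local explanation: it describes the model's sensitivity around one specific input, not its global behavior.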
Main file: main.pdf (78.53 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02507316 , version 1 (13-03-2020)

Identifiers

  • HAL Id : hal-02507316 , version 1

Cite

Valérie Beaudouin, Isabelle Bloch, David Bounie, Stéphan Clémençon, Florence d'Alché-Buc, et al. Identifying the "Right" Level of Explanation in a Given Situation. 1st International Workshop on New Foundations for Human-Centered AI (NeHuAI), ECAI 2020, Sep 2020, Santiago de Compostela, Spain. pp. 63-66. ⟨hal-02507316⟩
