Science and technology for the eXplanation of AI decision making
The XAI project focuses on the urgent open challenge of how to construct meaningful explanations of opaque AI/ML systems. It introduces a local-to-global framework for black-box explanation, articulated along three lines: a) a language for expressing explanations; b) the inference of local explanations; c) the bottom-up generalization of many local explanations into simple global ones.
Grant agreement ID: 834756
Total cost: 2 500 000€
EU Contribution: 2 500 000€
Principal investigator: Fosca Giannotti
Email: fosca.giannotti@isti.cnr.it
An intertwined line of research will investigate i) causal explanation models that capture the causal relationships among the variables and the decision, and ii) mechanistic/physical models that capture the detailed data generation behavior behind specific deep learning models.
This project will also develop: (1) an explanation infrastructure for benchmarking, equipped with platforms for the users' assessment of the explanations; (2) an ethical-legal framework, in compliance with the provisions of the GDPR; and (3) a repertoire of case studies in explanation-by-design, mainly focused on health and fraud detection applications.
National Research Council
University of Pisa
Department of Computer Science
Scuola Normale Superiore
The XAI research lines
Local-to-global paradigm for explanation by design
From statistical to causal and mechanistic/physical explanations
Ethical/legal framework for explanation
Events, tutorials, round tables, conferences and more...
Falling Walls Foundation | Jean-Pierre Bourguignon, Edith Heard, Fosca Giannotti, Sabina Leonelli
In this tutorial, Riccardo shows how to employ existing explanation libraries on tabular datasets.
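As a rough illustration of what such libraries do on tabular data (this is a hypothetical sketch, not the tutorial's actual code), the following implements a minimal LIME-style local explanation: perturb an instance, query the black-box model, and fit a distance-weighted linear surrogate whose coefficients indicate each feature's local contribution to the decision.

```python
# Minimal local-explanation sketch for a black-box classifier on tabular data.
# Assumes scikit-learn and NumPy; model and dataset choices are illustrative.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

X, y = load_breast_cancer(return_X_y=True)
black_box = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def explain_instance(x, n_samples=500, kernel_width=1.0, random_state=0):
    """Return per-feature weights of a local linear surrogate around x."""
    rng = np.random.default_rng(random_state)
    # Perturb the instance with Gaussian noise scaled to each feature's std.
    noise = rng.normal(scale=X.std(axis=0), size=(n_samples, x.size))
    Z = x + noise
    # Query the black box for the probability of the positive class.
    probs = black_box.predict_proba(Z)[:, 1]
    # Weight perturbed samples by proximity to x (exponential kernel).
    dists = np.linalg.norm((Z - x) / X.std(axis=0), axis=1)
    weights = np.exp(-(dists ** 2) / kernel_width ** 2)
    # Fit the interpretable surrogate; its coefficients are the explanation.
    surrogate = Ridge(alpha=1.0).fit(Z, probs, sample_weight=weights)
    return surrogate.coef_

weights = explain_instance(X[0])
top = np.argsort(np.abs(weights))[::-1][:3]
print("Top features:", [load_breast_cancer().feature_names[i] for i in top])
```

Libraries such as LIME and SHAP follow the same pattern but add careful sampling, kernels, and visualization; the project's local-to-global line then generalizes many such local surrogates into a single global explanation.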
Artificial Intelligence: The ineluctable revolution, a virtual session on "Artificial Intelligence: A blessing or a threat for society"