A Proposal for the Explanation Faithfulness Evaluation Measures Ontology (EFEMO)

Danielle Villa

As artificial intelligence systems become more advanced, there is a growing need to explain those systems. However, there is no consensus on how to ensure those explanations are faithful to a system's internal processes. Dozens of evaluation measures have been proposed, each making different assumptions, targeting specific types of explanations, and requiring different levels of model access, which makes it difficult to determine which measure is appropriate for a given situation. We propose the development of the Explanation Faithfulness Evaluation Measures Ontology (EFEMO) to organize this variety of measures and serve as the basis for a measure recommendation system. The work comprises a thorough analysis of the current literature, development of the ontology, and construction of a user-friendly recommendation system. We have already begun this process and hope the final product will encourage researchers to evaluate their explanations and produce better explainability methods.
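To make the organizing dimensions concrete, below is a minimal sketch, in Python, of how measures might be catalogued along those dimensions and filtered for recommendation. Every name here (Measure, ExplanationType, ModelAccess, and the example catalog entries) is a hypothetical illustration, not the actual EFEMO vocabulary, which is still under development.

    # Hypothetical sketch: cataloguing faithfulness measures by the
    # dimensions named in the abstract (assumptions, explanation type,
    # required model access) and recommending applicable ones.
    from dataclasses import dataclass, field
    from enum import Enum

    class ExplanationType(Enum):
        FEATURE_ATTRIBUTION = "feature attribution"
        COUNTERFACTUAL = "counterfactual"
        RULE_BASED = "rule-based"

    class ModelAccess(Enum):
        BLACK_BOX = "black box"   # model outputs only
        GRAY_BOX = "gray box"     # gradients or logits available
        WHITE_BOX = "white box"   # full access to internals

    @dataclass
    class Measure:
        name: str
        explanation_types: set[ExplanationType]
        required_access: ModelAccess
        assumptions: list[str] = field(default_factory=list)

    # Illustrative entries only; a real catalog would come from the
    # literature review the proposal describes.
    CATALOG = [
        Measure("Deletion/Insertion", {ExplanationType.FEATURE_ATTRIBUTION},
                ModelAccess.BLACK_BOX,
                ["perturbed inputs remain on-distribution"]),
        Measure("Gradient agreement", {ExplanationType.FEATURE_ATTRIBUTION},
                ModelAccess.GRAY_BOX,
                ["model is differentiable"]),
    ]

    def recommend(explanation_type: ExplanationType,
                  available_access: ModelAccess) -> list[Measure]:
        """Return measures that apply to the given explanation type and
        do not demand more model access than the user has."""
        order = [ModelAccess.BLACK_BOX, ModelAccess.GRAY_BOX,
                 ModelAccess.WHITE_BOX]
        return [m for m in CATALOG
                if explanation_type in m.explanation_types
                and order.index(m.required_access) <= order.index(available_access)]

    if __name__ == "__main__":
        for m in recommend(ExplanationType.FEATURE_ATTRIBUTION,
                           ModelAccess.GRAY_BOX):
            print(m.name, "- assumptions:", "; ".join(m.assumptions))

An ontology would additionally capture relationships between these dimensions (for example, that a gray-box measure subsumes black-box access), which a flat catalog like this cannot express; the sketch only illustrates the kind of query a recommendation system built on EFEMO might answer.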
