Determine applicability: consider this requirement when diverse users interact with the AI system, and assess whether the requirement has been satisfied.
• If the AI system's output requires an explanation, the evaluation criteria for explainability*, the level of understanding of how the output was produced, may differ depending on the user (e.g., medical staff or patient) who reviews the output. The assessment criteria for interpretability, the level of understanding of how the AI operates, likewise depend on the user.
* ISO/IEC TR 29119-11:2020 defines explainability as “the level of understanding how the AI-based system came up with a given result,” and interpretability as “the level of understanding how the underlying AI technology works.”
• Therefore, to test an AI system whose output may be interpreted against different criteria by medical staff and patients, it is best to organize a consultative group that prepares evaluation criteria for the explainability and interpretability of the system's output, or decides on the expected output, and then designs the test methodology through consultation among its members.
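As a minimal sketch of how role-specific criteria from such a consultative group might be recorded and applied, the following illustrates one possible structure. The role names, required explanation elements, and readability thresholds are illustrative assumptions, not drawn from the standard; an actual rubric would be defined by the group itself.

```python
# Hypothetical sketch: per-user evaluation criteria for explainability.
# Role names, criteria, and thresholds below are illustrative assumptions,
# not taken from any standard; a real rubric comes from the consultative group.

from dataclasses import dataclass

@dataclass
class ExplainabilityCriteria:
    """Evaluation criteria agreed by the consultative group for one user role."""
    user_role: str                 # e.g. "medical_staff" or "patient"
    required_elements: list[str]   # items an explanation must contain
    max_reading_level: int         # illustrative readability ceiling (grade level)

def evaluate_explanation(explanation_elements: list[str],
                         reading_level: int,
                         criteria: ExplainabilityCriteria) -> bool:
    """Return True if the explanation satisfies this role's criteria."""
    has_all = all(e in explanation_elements for e in criteria.required_elements)
    readable = reading_level <= criteria.max_reading_level
    return has_all and readable

# Criteria differ by user, as the guideline notes: clinicians may need
# feature-level evidence, while patients need a plain-language summary.
staff = ExplainabilityCriteria("medical_staff",
                               ["key_features", "confidence", "reference_cases"], 16)
patient = ExplainabilityCriteria("patient",
                                 ["plain_language_summary", "next_steps"], 8)

explanation = ["key_features", "confidence", "reference_cases"]
print(evaluate_explanation(explanation, reading_level=14, criteria=staff))    # True
print(evaluate_explanation(explanation, reading_level=14, criteria=patient))  # False
```

The same explanation passes for medical staff but fails for a patient, mirroring the point that a single output can satisfy one user group's criteria while failing another's.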