  • 10-1b. Do you provide acceptable evidence for the model’s inference results, in addition to XAI?
    • It is not always possible to explain an AI model’s inference result or to present the evidence behind its decision. Moreover, deploying AI technology in clinical practice requires thorough validation of clinical utility and safety, so an explanation of the inference result may not be sufficient even when XAI is applied [49].

    • In particular, medical devices that incorporate machine learning must ensure transparency in order to gain the trust of medical staff and patients. When designing clinical trial methods, justification should be provided for the choice of clinical efficacy evaluation variables (endpoints) and for the success criteria against which trial results are judged [50].

    • To provide evidence for the model’s inference results, the validation may include a comparison with the accuracy of clinicians’ readings, and experienced clinicians may participate in establishing the ground truth. Involving experienced clinicians in the clinical trial to explain the AI model’s validation process increases the acceptability of its inference results [8, 51]; a minimal sketch of such a reader comparison follows this list.
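
To illustrate the kind of reader comparison described in the last item, the sketch below computes sensitivity and specificity for both the model and a clinician reader against an expert-consensus reference standard, plus Cohen’s kappa for model–clinician agreement. All names and data (`ground_truth`, `model_preds`, `reader_preds`) are hypothetical placeholders; an actual clinical validation would follow a pre-specified statistical analysis plan.

```python
def sensitivity_specificity(preds, truth):
    """Return (sensitivity, specificity) of binary predictions vs. a reference standard."""
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(preds, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, truth))
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    p_a1, p_b1 = sum(a) / n, sum(b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Hypothetical validation data: 1 = finding present, 0 = absent.
# ground_truth stands in for a consensus reference set by experienced clinicians.
ground_truth = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
model_preds  = [1, 0, 1, 0, 0, 0, 1, 1, 1, 0]
reader_preds = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

for name, preds in [("model", model_preds), ("clinician", reader_preds)]:
    sens, spec = sensitivity_specificity(preds, ground_truth)
    print(f"{name}: sensitivity={sens:.2f}, specificity={spec:.2f}")
print(f"model vs. clinician kappa = {cohens_kappa(model_preds, reader_preds):.2f}")
```

Reporting the model’s performance side by side with a clinician’s, against a ground truth set by experienced readers, is one way to make the validation process concrete for trial reviewers.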