  • 10-3a Have you reviewed whether an explanation of the model’s inference result is needed?
    • Providing an explanation of an AI system’s inference results can help people make decisions with AI, but it can also be disruptive. So, rather than explaining the model’s inference results in every case, first examine whether an explanation should be provided at all.

    • The following are two examples where it may be preferable not to provide a detailed explanation of the model’s inference results (a brief illustrative sketch follows the list).
    - First, when an explanation of the inference results does not materially affect the user’s decision-making. If the impact of providing an explanation has not been clearly analyzed, a detailed explanation may seem helpful but can instead cause unexpected confusion. For instance, if the AI system produces two results with predictive probabilities of 85.8% and 87.0%, users may be unsure which result to rely on for their decision.
    - Second, when the predictive probability is very high or very low. If the user is informed that the predictive probability of the system’s output is 100%, they may accept the output unconditionally.
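
    As a minimal sketch of how these two checks might be combined in practice, the hypothetical function below (its name, thresholds, and inputs are illustrative assumptions, not part of this guideline) suppresses the display of predictive probabilities when the top candidates are nearly tied or when the top probability is extreme:

```python
def should_show_probabilities(scores: list[float],
                              min_margin: float = 0.05,
                              extreme: float = 0.99) -> bool:
    """Return True if per-result predictive probabilities are worth displaying.

    scores     -- predictive probabilities of the candidate results
    min_margin -- suppress detail when the top two scores are this close,
                  since near-ties (e.g. 85.8% vs 87.0%) tend to confuse users
    extreme    -- suppress detail when the top score is near 0 or 1,
                  since a value like 100% invites unconditional acceptance
    """
    ranked = sorted(scores, reverse=True)
    top = ranked[0]
    # Case 1: two candidate results with nearly equal probability.
    if len(ranked) > 1 and top - ranked[1] < min_margin:
        return False
    # Case 2: predictive probability too high or too low to aid judgment.
    if top >= extreme or top <= 1.0 - extreme:
        return False
    return True


print(should_show_probabilities([0.870, 0.858]))  # False: near-tie
print(should_show_probabilities([1.00]))          # False: 100% invites blind trust
print(should_show_probabilities([0.87, 0.10]))    # True: clear, informative margin
```

    In a real system, the thresholds and the fallback behavior (for example, showing a qualitative confidence label instead of an exact percentage) would need to come from the impact analysis described above.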