Determine applicability: Consider this question when guidelines and policies on AI governance have been prepared in accordance with "02-1," and determine whether the requirement has been satisfied.
• As mentioned in "02-1," healthcare AI systems carry risk factors that may lead to injury or death caused by undetected errors in the systems. A group with management and supervision responsibilities must be in place to identify these various risk factors, prepare related policies, and implement them.
• An AI governance group must prepare regulations that promote clinician reliability, patient safety, and accountability, and must oversee compliance with the guidelines and verify that procedural requirements have been met. This group should be composed of personnel with the relevant competencies, each of whom understands their own roles and responsibilities.
• If possible, an AI governance group should also include external experts (e.g. doctors, nurses, data scientists, clinical quality managers, academic faculty). External experts help overcome problems such as groupthink and correct biased viewpoints within the group. In addition, acquiring the methodology and professional knowledge required to operate AI systems from external experts makes it possible to maintain an agile and innovative approach to healthcare AI use cases.