  • 11-1a Have you examined the possibility of bias in the source code, for example in the implementation of the data access method?
    • Various biases can be introduced into an AI system while the data access method for the model is being implemented in code, for example by unintentionally omitting access to records of a certain class.
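    A minimal sketch of how such an omission can arise (the function and field names are hypothetical, chosen only for illustration): a data-loading filter intended to drop incomplete rows also silently discards every record whose field holds a legitimate zero, which may disproportionately exclude one group.

    ```python
    # Hypothetical sketch: a data-access function whose filtering logic
    # silently drops records for one group, introducing sampling bias.

    def load_training_records(records):
        # BUG: the intent was to drop rows with a missing "income" field,
        # but this truthiness test also discards every record whose
        # income is 0 -- potentially excluding one class of applicants.
        return [r for r in records if r.get("income")]

    def load_training_records_fixed(records):
        # Test explicitly for missing values instead of truthiness,
        # so legitimate zero values are kept for every group.
        return [r for r in records if r.get("income") is not None]

    data = [
        {"group": "A", "income": 50000},
        {"group": "B", "income": 0},     # valid record, zero income
        {"group": "B", "income": None},  # genuinely missing
    ]

    print(len(load_training_records(data)))        # → 1 (zero-income record lost)
    print(len(load_training_records_fixed(data)))  # → 2 (only the missing row dropped)
    ```

    Reviewing such filters class by class during code review is one concrete way to carry out the check this item asks for.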

    • In rule-based systems especially, hard-coded rules derived from the knowledge of experts in a single field may skew the output toward a specific class, creating potential cognitive bias. It is therefore desirable to recruit experts with experience in a variety of fields; drawing on diverse background knowledge and experience can help mitigate bias in the system.

    • Open-source tools (e.g. FairML, Google's What-If Tool) may be used during AI system design and development to uncover hidden bias by periodically analyzing statistics of the output data, or to flag risky functions against a predefined fairness metric. Such tools enable prompt discovery of, and response to, bias in the system implementation.
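    The periodic check described above can be sketched without any external tool (this is not FairML or the What-If Tool themselves, just an illustration of the idea): compute a simple fairness metric, here the demographic parity difference, over the system's recent outputs and raise an alert when it exceeds a predefined threshold. The group labels and threshold value are assumptions for the example.

    ```python
    # Sketch of a periodic fairness check on output data:
    # demographic parity difference = max - min positive-prediction rate
    # across groups, compared against a predefined threshold.
    from collections import defaultdict

    def demographic_parity_difference(outputs):
        """outputs: iterable of (group, predicted_positive: bool) pairs."""
        positives = defaultdict(int)
        totals = defaultdict(int)
        for group, positive in outputs:
            totals[group] += 1
            positives[group] += int(positive)
        rates = {g: positives[g] / totals[g] for g in totals}
        return max(rates.values()) - min(rates.values())

    # Hypothetical monitoring sample from the deployed system.
    outputs = [("A", True), ("A", True), ("A", False),
               ("B", True), ("B", False), ("B", False)]

    gap = demographic_parity_difference(outputs)
    THRESHOLD = 0.2  # assumed fairness-metric threshold for this sketch
    print(f"parity gap = {gap:.2f}, alert = {gap > THRESHOLD}")
    # → parity gap = 0.33, alert = True
    ```

    Running such a check on a schedule (e.g. with each model release or monitoring batch) gives the "prompt discovery and response" the item calls for, independent of which fairness tool is ultimately adopted.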