Posted on 2022-07-25, 00:33. Authored by R T O'Donnell, A E Nicholson, B Han, K B Korb, M J Alam, L R Hope.
Bayesian networks (BNs) are rapidly becoming a leading tool in applied Artificial Intelligence (AI). BNs may be built by eliciting expert knowledge or learned via causal discovery programs. Both approaches have limitations: expert elicitation is expensive, time-consuming and relies on experts having full domain knowledge, while causal discovery is often ineffective on small or noisy datasets. A hybrid approach is to incorporate prior information elicited from experts into the causal discovery process. We present several ways of using expert information as prior probabilities in the CaMML causal discovery program. We compare CaMML, with and without prior information, against a variety of other BN learners. Our results show that without prior information CaMML performs comparably to these learners, and with prior information it outperforms them. We also present results showing that CaMML is well calibrated to variations in the expert's skill and confidence.
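To give a flavour of what "expert information as prior probabilities" can mean in structure learning, the sketch below shows one simple way an expert's stated probability for a directed arc could be blended with a data-driven structure score. This is an illustrative assumption only, not CaMML's actual prior formulation or MML scoring; the function names, the linear blending with an uninformative 0.5 prior, and the confidence weight are all hypothetical.

```python
# Illustrative sketch (NOT CaMML's implementation): combine an expert's
# prior belief about a directed arc with a data-derived structure score.
import math

def arc_log_prior(arc_present: bool, expert_prob: float, confidence: float) -> float:
    """Log prior for a single arc.

    expert_prob -- expert's stated probability that the arc exists (0..1)
    confidence  -- weight in [0, 1]; 0 ignores the expert, 1 trusts fully
    (hypothetical parameterisation, for illustration only)
    """
    # Blend the expert's belief with an uninformative 0.5 prior.
    p = confidence * expert_prob + (1.0 - confidence) * 0.5
    return math.log(p if arc_present else 1.0 - p)

def structure_log_posterior(data_log_score, arcs, expert_priors, confidence):
    """Data score plus summed arc log priors for one candidate structure."""
    prior = sum(
        arc_log_prior(present, expert_priors.get(arc, 0.5), confidence)
        for arc, present in arcs.items()
    )
    return data_log_score + prior

# Example: the expert is 90% sure the arc Smoking -> Cancer exists.
arcs = {("Smoking", "Cancer"): True, ("Cancer", "Smoking"): False}
priors = {("Smoking", "Cancer"): 0.9}
print(structure_log_posterior(-1234.5, arcs, priors, confidence=0.8))
```

Under this kind of scheme, the confidence weight is what allows the learner to remain calibrated to the expert's skill: as confidence approaches zero the prior reverts to being uninformative and the data alone drives the search.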