
Issue 1, 2013

Assessing confidence in predictions made by knowledge-based systems



A new metric, “veracity”, is proposed for assessing the performance of qualitative, reasoning-based prediction systems that takes into account the ability of these systems to express levels of confidence in their predictions. Veracity is shown to be compatible with concordance and it is hoped that it will provide a useful alternative to concordance and other Cooper statistics for the assessment of reasoning-based systems and for comparing them with other types of prediction system. A few datasets for four end points covered by the program, Derek for Windows, have been used to illustrate calculations of veracity. The levels of confidence expressed by Derek for Windows in these examples are shown to carry meaningful information. The approach provides a way of judging how well open predictions (“nothing to report” in Derek for Windows) can support qualified predictions of inactivity.
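The abstract does not give the formula for veracity, but the Cooper statistics it is compared against (sensitivity, specificity, and concordance computed from a 2x2 table of predicted versus observed activity) are standard. A minimal sketch of those baseline statistics, with illustrative counts only:

```python
def cooper_statistics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the classic Cooper statistics from a 2x2 confusion table.

    tp: active compounds predicted active (true positives)
    fp: inactive compounds predicted active (false positives)
    tn: inactive compounds predicted inactive (true negatives)
    fn: active compounds predicted inactive (false negatives)
    """
    sensitivity = tp / (tp + fn)          # fraction of actives correctly predicted
    specificity = tn / (tn + fp)          # fraction of inactives correctly predicted
    concordance = (tp + tn) / (tp + fp + tn + fn)  # overall fraction correct
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "concordance": concordance,
    }

# Hypothetical counts, for illustration only (not from the paper's datasets):
stats = cooper_statistics(tp=40, fp=10, tn=30, fn=20)
print(stats)  # {'sensitivity': 0.666..., 'specificity': 0.75, 'concordance': 0.7}
```

Veracity, as described above, extends this style of assessment by weighting predictions according to the confidence level the reasoning system attaches to them, so that an open prediction ("nothing to report") is treated differently from a confident negative.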




Article information

Submitted: 18 Jun 2012
Accepted: 21 Sep 2012
First published: 26 Sep 2012

Toxicol. Res., 2013, 2, 70-79
