
Issue 1, 2013

Assessing confidence in predictions made by knowledge-based systems

P. N. Judson, S. A. Stalford and J. Vessey

Abstract

A new metric, “veracity”, is proposed for assessing the performance of qualitative, reasoning-based prediction systems; it takes into account the ability of these systems to express levels of confidence in their predictions. Veracity is shown to be compatible with concordance, and it is hoped that it will provide a useful alternative to concordance and the other Cooper statistics for the assessment of reasoning-based systems and for comparing them with other types of prediction system. Datasets for four end points covered by the program Derek for Windows are used to illustrate the calculation of veracity. The levels of confidence expressed by Derek for Windows in these examples are shown to carry meaningful information, and the approach provides a way of judging how well open predictions (“nothing to report” in Derek for Windows) can support qualified predictions of inactivity.

Graphical abstract: Assessing confidence in predictions made by knowledge-based systems
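
For readers unfamiliar with the Cooper statistics against which veracity is positioned, the sketch below computes sensitivity, specificity and concordance from a standard 2 × 2 contingency table of predicted versus observed activity. It is a minimal illustration only: the function name and the example counts are hypothetical, and the veracity calculation itself, which additionally weights predictions by the level of confidence expressed, is defined in the paper and is not reproduced here.

def cooper_statistics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity and concordance from a 2x2 contingency table."""
    total = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn),     # actives correctly predicted active
        "specificity": tn / (tn + fp),     # inactives correctly predicted inactive
        "concordance": (tp + tn) / total,  # overall fraction of correct predictions
    }

if __name__ == "__main__":
    # Hypothetical counts for a small validation set.
    print(cooper_statistics(tp=40, fp=10, tn=35, fn=15))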



Publication details

The article was received on 18 Jun 2012, accepted on 21 Sep 2012 and first published on 26 Sep 2012.


Article type: Paper
DOI: 10.1039/C2TX20037F
Citation: Toxicol. Res., 2013, 2, 70-79
