
Towards evidence based medicine for paediatricians
    Correspondence to Dr Bob Phillips, Centre for Reviews and Dissemination, University of York, York YO10 5DD, UK; bob.phillips{at}doctors.org.uk


    More than lucky

    When two people/researchers/clinicians are looking at the same thing, we often want to know whether they are seeing/deciding the same thing. It has been said (though not by me, as I rely on them too much) that two radiologists reviewing the same chest radiograph will have three different opinions on whether there is pneumonia. Measuring the amount of agreement is often done using the κ statistic.

    κ is a measure of agreement, usually between two observers of a dichotomous outcome, although there are variants for multiple observers. It tells you how much of the agreement you see is ‘beyond chance’.

    Take an observer reporting on the colour of balls emerging from a bag, knowing there were five blue and five yellow. If they didn't look, and just guessed, you'd expect them to be right 50% of the time. If there were two observers, both distracted by reading the editorial in ADC F&N, you would expect the pair of them to be right 0.5×0.5=0.25 (25%) of the time.

    Anything over this is “agreement beyond chance”.
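    Putting the pieces together: Cohen's κ compares the agreement you actually observe with the agreement you would expect if the two observers' calls were independent, given how often each observer uses each category: κ = (observed agreement − expected chance agreement)/(1 − expected chance agreement). The sketch below is a minimal Python illustration of that calculation; the two observers and their calls are invented for the example.

        def cohens_kappa(rater_a, rater_b, categories=("blue", "yellow")):
            n = len(rater_a)
            # Observed agreement: proportion of balls both observers labelled the same.
            p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            # Expected chance agreement: for each colour, multiply the two observers'
            # overall proportions of that call, then sum over the colours.
            p_expected = sum(
                (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
            )
            # Agreement beyond chance, as a fraction of the maximum achievable beyond chance.
            return (p_observed - p_expected) / (1 - p_expected)

        observer_1 = ["blue"] * 5 + ["yellow"] * 5   # five blue and five yellow calls
        observer_2 = ["blue"] * 4 + ["yellow"] * 6   # disagrees on a single ball
        print(round(cohens_kappa(observer_1, observer_2), 2))  # prints 0.8

    Here the observers agree on 9 of the 10 balls (observed agreement 0.9) and the chance agreement from their overall proportions is 0.5, so κ = (0.9 - 0.5)/(1 - 0.5) = 0.8.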

    The value given to κ reflects the degree of agreement: 0 means no agreement beyond chance, 1 means perfect agreement, and values in between are usually interpreted as follows:

    κ            Strength of agreement
    <0.20        Poor
    0.21–0.40    Fair
    0.41–0.60    Moderate
    0.61–0.80    Good
    0.81–1.00    Excellent

    How good you ‘need’ this to be depends upon what you're using it for. In the same way that there is no ‘right’ wine to go with a meal, there is no ‘correct’ κ value.

    Footnotes

    • Competing interests None.

    • Provenance and peer review Commissioned; internally peer reviewed.
