Abstract
Aims Reducing inter-examiner variability is crucial for the fairness of clinical examinations. Training of examiners is regarded as essential in ensuring the validity and reliability of clinical examinations. We aimed to develop an interactive online package to aid the training of examiners assessing undergraduate paediatric students and to evaluate inter-examiner variability.
Methods Using publicly available Google Sites, we created an online tool to train examiners for their role in paediatric undergraduate clinical skills assessments. It comprised a sequence of five videos of students, each presenting a clinical case (history and examination of a child). These case presentations were scored online by examiners (scale 0–15 for the total score) using an interactive mark sheet that automatically recorded the scores. Subsequently, examiners could compare their scores against an average given by a panel of senior expert examiners. In addition, the recorded data were analysed for overall mean scores and standard deviation (SD). The students were ranked according to performance (one excellent, one clear fail and three in between) using predetermined criteria.
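As a minimal sketch of how the recorded marks could be summarised for one student (the values and variable names below are illustrative assumptions, not data or code from the study), the mean and SD comparison between trainee and expert examiners might be computed as follows:

```python
import statistics

# Hypothetical marks (0-15 scale) for a single student, as captured by the
# interactive mark sheet. These numbers are illustrative only.
trainee_scores = [9, 11, 8, 12, 10, 7, 11]
expert_scores = [11, 12, 11, 13, 12]

def summarise(scores):
    """Return the mean and sample standard deviation of a list of marks."""
    return statistics.mean(scores), statistics.stdev(scores)

trainee_mean, trainee_sd = summarise(trainee_scores)
expert_mean, expert_sd = summarise(expert_scores)

print(f"Trainees: mean={trainee_mean:.1f}, SD={trainee_sd:.1f}")
print(f"Experts:  mean={expert_mean:.1f}, SD={expert_sd:.1f}")
```

A higher SD among trainee examiners than among the expert panel would indicate greater inter-examiner variability, which is the comparison reported in the Results and Conclusions.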
Results A total of 31 examiners participated, of whom 18 fully completed the online package.
Conclusions Trainee examiners considered the tool helpful, especially when performing the clinical skills assessments for the first time. The results demonstrate that variation in scores was higher among trainee examiners, except for student number 2. Overall, scores given by trainee examiners tended to be lower than those given by experienced expert examiners.