Abstract
Objective Medication errors are common, with junior doctors accounting for the majority in acute healthcare. Paediatrics is uniquely challenging, but the evidence base to guide prescribing education is limited. The authors set out to develop a short, educationally sound, low cost e-learning resource for paediatric prescribing to improve junior doctors' prescribing skills and to evaluate its effectiveness.
Design A non-blinded randomised controlled trial.
Setting North Western Deanery Foundation School, UK.
Participants 162 volunteer foundation (junior) doctors randomised into control (86) and intervention (76) groups.
Interventions On study entry, participants were assessed on prescribing skill, prescribing habits and confidence. The intervention group then completed the e-learning course designed for the study, which took 1–2 h. At 1 and 3 months after the intervention, both groups completed similar prescribing assessments and were reassessed on prescribing habits and confidence.
Main outcome measures Total score (expressed as a percentage) on prescribing assessments, confidence and satisfaction scores.
Results There were no preintervention differences in prescribing assessment scores (67% vs 67%, p=0.56). Postintervention, the e-learning group scored significantly higher than the control group (79% vs 63%, p<0.0001). At 3 months, the e-learning group still scored significantly higher (79% vs 69%, p<0.0001), with improved confidence scores (p<0.0001).
Conclusions This short e-learning resource significantly improved the paediatric prescribing skills of junior doctors. Outcomes were maintained at 3 months, suggesting the utility of low cost, low fidelity, educationally sound e-learning interventions. However, the direct impact on patient outcomes following this intervention has yet to be determined.
Background
Errant prescribing is one of the most common errors in healthcare,1 contributing to 7000 deaths annually in the USA.2 A recent review3 identified measures such as electronic prescribing, computerised order entry systems and clinical pharmacy services as effective in reducing prescribing errors. Although a national undergraduate prescribing examination is being developed, current research suggests that graduates are at high risk of error,4 with trainees reporting low confidence5 6 and a desire for additional training. The General Medical Council and the Medical Schools Council convened a Safe Prescribing Working Group in 2009 to tackle this issue.7 Although one of its key recommendations was enhanced continuing medical education in prescribing, a recent study8 suggests that this is not being delivered by paediatricians in the UK, although there are some promising reports in the literature.
What is already known on this topic
▶ Medication errors by junior doctors are a common source of adverse events in healthcare.
▶ Postgraduate education can improve paediatric prescribing, but poor reporting of the interventions and methodological weaknesses limit such research.
What this study adds
▶ A short, low cost and pedagogically sound e-learning intervention for junior doctors can significantly enhance paediatric prescribing skills.
▶ This improvement is maintained at 3 months after the intervention.
In one study, paediatric prescribing errors were halved after the introduction of a junior doctor prescribing tutorial.9 Other reported educational methods include problem based learning,10–12 interactive tutorials13 and computer games.14 All these studies share a key weakness: they do not report the educational interventions in sufficient detail to allow replication. There are also methodological limitations, with generally small sample sizes and no published randomised controlled studies within postgraduate training. A recent systematic review15 concluded there is only moderate evidence to inform the design of prescribing educational interventions for junior doctors.
Cook reviewed the evidence16 and found that e-learning is better than no teaching and broadly equivalent to other forms of teaching. He argues17 that different modes of teaching are complementary, each suited to purposes that play to its own strengths. The question for medical educators is when and how best to employ e-learning. A paediatric prescribing intervention delivered as e-learning could be standardised yet individualised, convenient and time efficient. When designing e-learning courses, the need for software support, technical infrastructure and training for educators must all be considered, as these factors can have significant logistical and cost implications.
We set out to investigate how effectively a short, low fidelity e-learning course on paediatric prescribing could improve skills among junior doctors.
Methods
Ethics approval for this study was received from the University of Dundee.
Study design
We measured the effectiveness of the intervention on improvement in prescribing skills using a non-blinded randomised controlled trial.
Intervention
The intervention was designed in Microsoft PowerPoint 2007 and converted, using the Rapid E-learning Suite (v 5.6.5; Wondershare Software, Shenzhen, China), into a self-contained Flash program supporting self-assessment exercises, video files and animations. The structure of the e-learning course is shown in online supplementary appendix 1. The programme was designed to be completed in 1–2 h. Paediatric pharmacists independently reviewed the intervention and the prescribing assessments, and both were piloted among junior doctors. There were three different 10-question assessments, all similarly structured, with questions in four categories: drug selection, prescribing calculations for children, discussing therapies and sources of error. The first assessment carried 85 marks, while subsequent assessments carried 100 marks each, with additional elements added to limit improvement attributable to a test–retest effect. An example question is shown in box 1 and the full assessments and marking guides are shown in online supplementary appendix 2.
Box 1 Example question from prescribing assessments
You are asked to make up an intravenous morphine bolus for a patient. You firstly check the prescription. If he weighs 21 kg and the dosage is 200 micrograms/kg, what is the dose?
4 Marks
It comes in strengths of 1 mg in 1 ml and 10 mg in 1 ml. Please select an appropriate strength and solution for dilution for making up the morphine bolus: (Delete as applicable) 1 mg in 1 ml or 10 mg in 1 ml.
1 ml of water or 10 ml of water or 1 ml 0.9% saline or 10 ml 0.9% saline.
3 Marks
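For the dose portion of this question, the arithmetic is a simple weight-based multiplication (21 kg × 200 micrograms/kg = 4200 micrograms, ie, 4.2 mg). A minimal sketch of that calculation is shown below; the helper function is purely illustrative and is not part of the assessment or its marking guide.

```python
# Illustrative weight-based dose calculation for the example question in box 1.
# The figures (21 kg, 200 micrograms/kg) come from the question itself; the
# helper function below is hypothetical and not part of the study materials.

def weight_based_dose_mg(weight_kg: float, dose_micrograms_per_kg: float) -> float:
    """Return the total dose in milligrams for a weight-based prescription."""
    total_micrograms = weight_kg * dose_micrograms_per_kg
    return total_micrograms / 1000  # micrograms to milligrams

print(weight_based_dose_mg(21, 200))  # 4.2 (mg)
```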
Instructional objectives were derived from the Foundation and Royal College of Paediatrics and Child Health18 19 curricula (online supplementary appendix 3). Gagné's nine events of instruction were used to design the course structure.20 Cognitive load theory,21 which aims to prevent overload of working memory,22 23 was used to increase the learning efficiency of the intervention. These theories are presented in online supplementary appendix 4.
Recruitment
Volunteer trainees within the North Western Deanery Foundation School, which has approximately 1150 trainees, enrolled during July and August 2010. Exclusion criteria were: a pharmacy degree; a history of working in the drugs industry; previous work as a doctor; and limitations on prescribing. It was calculated that a sample of 124 participants was needed to provide 90% power (p<0.05, two tailed test) to detect a 25% difference in scores. To allow for a 15%–20% drop-out at each assessment, a sample of over 200 was obtained.
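The paper reports the target difference (25%), power (90%) and significance level (p<0.05) but not the assumed variance, so the calculation cannot be reproduced exactly. The sketch below shows how a comparable two-sample calculation could be run, with the standardised effect size chosen purely for illustration.

```python
# Sketch of a two-sample sample-size calculation of the kind described above.
# The standardised effect size (0.6) is an assumption for illustration only;
# the study reports the target difference, power and alpha but not the
# underlying standard deviation.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.6, alpha=0.05,
                                          power=0.9, alternative="two-sided")
print(round(n_per_group), "per group,", 2 * round(n_per_group), "in total")
```

With these illustrative inputs the total is close to, though not identical to, the 124 participants quoted above, reflecting the unknown assumed variance.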
A computerised random number generator allocated the 206 participants to control and intervention groups in a 1:1 ratio. Assignments were concealed in sealed, light-proof envelopes prepared by an independent researcher and opened sequentially once a participant had consented to inclusion. Participants were given identical baseline prescribing assessments. The intervention group were then sent the e-learning package and given 4 weeks to complete it. All participants were then sent a second assessment and questionnaire, and a final assessment was sent 8 weeks later.
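As an illustration only, the sketch below shows one way a computer-generated 1:1 allocation sequence of the kind described could be produced; the seed and labels are hypothetical and this is not the study's actual procedure.

```python
# Hypothetical sketch of computer-generated 1:1 allocation. In the study,
# assignments were concealed in sealed, light-proof envelopes prepared by an
# independent researcher; only the random allocation step is sketched here.
import random

def simple_1_to_1_allocation(n_participants: int, seed: int = 42) -> list[str]:
    """Assign each participant to 'control' or 'intervention' with equal probability."""
    rng = random.Random(seed)
    return [rng.choice(["control", "intervention"]) for _ in range(n_participants)]

sequence = simple_1_to_1_allocation(206)
print(sequence.count("control"), sequence.count("intervention"))
```

Simple randomisation of this kind yields approximately rather than exactly equal groups, which is consistent with the 106/99 split reported in the results.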
Data analysis
The primary outcome measure was prescribing skill, measured by the total correct responses on each prescribing assessment. As the baseline assessment carried 85 rather than 100 marks, scores were converted to percentages to allow comparison with subsequent assessments. Secondary outcomes were prescribing confidence and satisfaction with prescribing education, measured by totalling Likert scores.
The researcher was blinded to participants' allocation when marking assessments and performing the analysis. Student's t test was used for prescribing scores and the Wilcoxon signed rank test for secondary outcomes. Data were analysed in StatsDirect (v 2.7.8; StatsDirect, Altrincham, Cheshire, UK).
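As an illustration of the comparisons described, the sketch below uses hypothetical score data and SciPy routines in place of StatsDirect; the percentage conversion mirrors the scaling of the 85-mark baseline assessment.

```python
# Illustrative analysis of the kind described above, using hypothetical data
# and SciPy in place of StatsDirect.
import numpy as np
from scipy import stats

# Hypothetical raw baseline marks (out of 85) for the two groups.
control_raw = np.array([55, 60, 58, 62, 57, 59])
intervention_raw = np.array([56, 61, 57, 63, 58, 60])

# Convert to percentages so the 85-mark baseline is comparable with the
# 100-mark follow-up assessments.
control_pct = 100 * control_raw / 85
intervention_pct = 100 * intervention_raw / 85

# Primary outcome: between-group comparison of percentage scores (Student's t test).
t_stat, p_t = stats.ttest_ind(control_pct, intervention_pct)
print(f"t = {t_stat:.2f}, p = {p_t:.3f}")

# Secondary outcome: within-group change in totalled Likert confidence scores
# (Wilcoxon signed rank test on paired before/after totals).
confidence_before = np.array([12, 14, 13, 15, 11, 13])
confidence_after = np.array([13, 16, 17, 20, 18, 22])
w_stat, p_w = stats.wilcoxon(confidence_before, confidence_after)
print(f"W = {w_stat:.1f}, p = {p_w:.3f}")
```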
Results
Figure 1 shows the study profile, reported in line with the Consolidated Standards of Reporting Trials (CONSORT) guidelines.24 There were 106 participants randomised to the control group and 99 to the intervention group; demographic characteristics such as gender, age and previous degrees were equally distributed between the groups.
The baseline and postintervention scores for each of the outcomes are shown in table 1. There was no significant difference in baseline scores between the groups for any outcome measure. At 4 weeks after the intervention, there was a significant increase in the intervention group's prescribing scores and this was maintained at 12 weeks. Confidence and satisfaction scores in the intervention group also showed statistically significant increases at 4 and 12 weeks. There was no significant difference in the prescribing scores of the control group between the baseline and 12-week postintervention assessments (66% vs 68%, p=0.36).
Further analyses assessed the potential impact of participant characteristics by excluding participants who had received prescribing teaching since recruitment, those with previous degrees and year two trainees; the results were unchanged. A 'per protocol' analysis, removing all participants who did not complete all three assessments, likewise had no impact on the results, with similar differences in scores. Feedback on the e-learning intervention was almost universally positive.
Discussion
It is unsurprising that prescribing scores increased on reassessment in the group who received the e-learning course. The persistence of improvement at 12 weeks, and the corresponding lack of improvement in the control group's scores, are much more informative; longer term retention has rarely been investigated in previous studies. That such a short module produced a measurable improvement in prescribing skills at 12 weeks suggests the potential utility of this type of intervention within early postgraduate training.
Given the high rates of error in paediatric prescribing by junior doctors, the use of this or similar interventions would seem advisable and could be implemented easily. If it became a mandatory element of induction, prescribing skills, and ultimately outcomes for patients, could be improved. The intervention was designed with a widely available and simple piece of software that allows educators to create most material in a familiar program. The finished e-learning course is easy to deliver as a self-contained intervention and requires no new infrastructure or expenditure. Because it is low fidelity, the need for continuing support is minimal and updating is easy. This manuscript and its supporting materials should allow educators to produce similar programs for use in their own settings.
With the flurry of recent investment in e-learning at all levels of medical education, it is disappointing that so little of this work is guided by evidence. The authors maintain the view that the divide between theory and practice is limiting the effectiveness of much e-learning, with too much faith in the technology and too little focus on pedagogy. This study has attempted to challenge the role of ‘technology’ in technology-enhanced learning. Work is clearly needed to investigate other low fidelity e-learning interventions in medical education.
This study does have a number of limitations. Participants were volunteers, introducing potential selection bias. There was also a large drop-out between recruitment and the first assessment, although a subsequent subgroup analysis of participant demographics found no significant differences. Finally, this study investigated improvements in skills and knowledge, but not their transfer into practice.
Previous work on patient safety issues has identified a gap between demonstrating improvement in skills and demonstrating improvement in outcomes for patients.25 As improved patient outcomes are the ultimate aim of all quality improvement projects, research investigating the transfer of these skills into practice and a reduction in adverse events for patients is needed.
In summary, a short e-learning module, taking less than 2 h, is able to improve paediatric prescribing skills significantly. The intervention uses simple and low cost production tools with a sound educational grounding and should be reproducible by others. Improvements are maintained at 3 months and this suggests the utility of such an intervention to improve the skills of junior doctors.
References
Supplementary materials
Supplementary Data
Files in this Data Supplement:
- Web only data (produced by the BMJ Publishing Group from an electronic file supplied by the authors; not edited for content)
Footnotes
- Competing interests None.
- Ethics approval The University of Dundee Ethics Committee approved this study.
- Provenance and peer review Not commissioned; externally peer reviewed.