Managing difficult issues: does training have any impact?
  1. Deborah G Murdoch-Eaton
  1. Correspondence to Professor Deborah G Murdoch-Eaton, Department of Paediatrics/Medical Education, School of Medicine, Leeds Institute of Medical Education, University of Leeds, Leeds LS2 9NS, UK; d.g.murdoch-eaton@leeds.ac.uk


Training is not the only issue to consider when evaluating practice; however, the burgeoning number of training courses attended presumes that attending purposively designed educational interventions will lead to greater learning and, through transfer of skills into the workplace, will subsequently improve outcomes. The paper from Wake et al1 poses some uncomfortable questions about the impact of self-perceived competence and of training on the management of obesity. The paper's importance lies in its consideration of clinical practice when addressing a significant health issue, and it illustrates the difficulties of assessing the impact of training on difficult and challenging problems.

The impact of training is most frequently evaluated simplistically, in terms of the perceived quality of the session (Kirkpatrick's first level). Whether training alters learners' behaviour is also relatively easily addressed, by evaluating change in practice and by alignment with measurable learning outcomes from the training course. The major challenge arises in evaluating the impact on patient care and outcomes: training is only one relatively small piece of the jigsaw of influences on outcomes in complex areas such as obesity management, where social, cultural, economic and sometimes even ethical dimensions also need to change.

Participation in continuing professional development implies that individual clinicians identify and select training according to their own training needs; increasingly, however, it is also driven by a requirement for evidence of completing designated ‘mandatory’ training dictated by local and/or national professional bodies or employers. Engagement with learning, and its impact on subsequent performance, have many learner-centred influences, not least this important recognition of the need for training and self-regulation. Continuing professional development will inevitably include aspects for which learners perceive no pressing need, which are attended with some reluctance and, even if acknowledged to be important, may be approached with variable engagement. The study by Wake and colleagues1 demonstrated low recognition of the need for training and, in those who had received specific training, little effect on diagnostic or management skills. Other important influences on learning include the imminence of the learning need (‘how soon will I need this information, or be tested on it?’) and the consequences (‘what's the worst that can happen if I don't know this?’). This indicates how the mindset with which a learner approaches the training materials can limit learning from any training, irrespective of its quality.

Engaging teachers and well-designed sessions can influence outcomes, even with reluctant learners, but this requires more than a brief, didactic lecture! A repetitive learning method with feedback and immediate testing, requiring demonstration of competence, underpins much skills training, as illustrated by the approach taken in life support training. Many skills training centres have invested in simulation; however, it is important to recognise that the benefit of high-fidelity (and expensive) simulation over low-fidelity simulation, in terms of transfer of learning to actual patient care, has not been proven.2

Training in clinical reasoning is fundamental to effective practice, but it similarly poses real challenges of training methodology and utility. Formal structured sessions usually use clinically complex conundrums, broken down into manageable chunks and addressed in a logical progression sampling areas of risk. But is this truly an authentic representation of the complexities posed by the reality of difficult and challenging clinical management? Effective skills in applied clinical reasoning centre on integration, and deliberative reasoning particularly concerns uncertainty, situational modification and application in practice. Educators' style of delivery and design of training (in both standalone courses and in workplace-based clinical observation and feedback) need to develop aptitudes for holistic clinical practice in contextually relevant, real situations, facilitating the recognition and integration of many sources of information and influence.

Educational research on training health professionals in clinical reasoning also indicates the limitations of training in complex and difficult areas of clinical practice. There are some indications that one can train effectively for a task but not for the action taken, that is, for application of the appropriate clinical diagnostic decision-making process.3 This aligns with research into mindfulness training and motivational interviewing, and reminds us of the limitations of training outside the workplace. Effective clinical practice is not just about reaching a conclusion but about the subsequent decision-making processes, including the rightness of the decision, at that time, for that particular situation, and in a holistic context and environment more likely to influence change, challenge assumptions and culture, and enhance individual or group motivational forces.

Effecting changes in behaviour is often the most challenging goal, frequently culturally and situationally dependent, and little evidence exists for effective (educational) interventions.4 Investing in further training in our current approach, particularly away from the authentic workplace, would seem, depressingly, likely to add little value. In challenging ‘heartsink’ areas of practice, perhaps we should invest more in training around resilience. Resilience can be defined as ‘a dynamic capability which can allow people to thrive on challenges given appropriate social and personal contexts’.5 This has the potential to improve effectiveness in situations in which the paediatrician's impact and influence may seem constrained to a small piece of the jigsaw, such as the management of obesity. Enhancing clinicians' capacity to respond positively to challenging, and at times disheartening, experiences in clinical practice has some attraction and is a potentially underdeveloped area of importance for effective professional practice.

References

Footnotes

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.
