Commentary on the review by McLellan
As recognised by the National Audit Office, in just five years, NHS Direct has established an impressive track record for customer satisfaction and patient safety, empowering patients to make better informed choices about their own healthcare. It has also clearly identified its potential to contribute to wider developments in the NHS. Building on this success the government is investing in NHS Direct to meet the anticipated growth in demand for its core service and to enable the service to play its part in the modernisation of out-of-hours services.1
In this issue, McLellan appropriately highlights the multi-channel provision of the service via telephone, digital television, internet, and information kiosk.2 He also rightly identifies some of the challenges facing the service. The move from the current association of 22 separately hosted sites to a single national provider will play a significant role in meeting these challenges; several key developments in the service, set out in Developing NHS Direct, have already been made possible specifically by this direction of development.1
NHS Direct introduced standard national reports for clinical indicators in the spring of 2003. These include the sorting of symptomatic calls, rates of use of algorithms to support assessment, and rates of selection of a different endpoint than that recommended by an algorithm. The reporting is done at site and individual clinician level. Together with other elements of NHS Direct’s Performance Framework, such as use of standard national call monitoring tools, this will form one of the most highly developed systems of performance monitoring of a large group of individual clinicians in the NHS.
As with any performance management, great care has to be taken to avoid unintended pitfalls, including misinterpretation.3 Multiple factors potentially contribute to variation in performance, such as the age/gender profile of callers, the proportions of core 0845 and GP out-of-hours work undertaken, and even the precise arrangements with different OOH providers. The “significant variability” cited by McLellan is therefore an over-simplification at this early stage after introduction. Work to clarify the degree to which apparent variation in the clinical indicator reports represents actual variation in performance is ongoing, but clinical indicator targets, taking account of internal and external studies to date, were set in October, and sites are beginning to manage clinical performance against them.
One example of the further work being done is the Gold Standard Sorting Study. Throughout July 2003, a major study was undertaken to benchmark NHS Direct against GPs in clinical risk tolerance for primary presentations (first point of contact) by telephone. One hundred and twelve GPs from across the whole of England and Wales took part in this study; the results will be submitted for publication and have been used to inform the setting of the sorting indicator targets.
Managing peaks in demand is not new to NHS Direct. The call volumes for Christmas week 2002 were 50% higher than the average weekly volume in the autumn of 2002. There are a number of technical, process, and staffing developments already underway, some of which McLellan refers to, that will increase NHS Direct’s capacity and efficiency so that the increased call volumes over the next few years will not require a pro rata increase in staffing to that currently in post.
Increasing consistency, transparency, and system development are three of the challenges raised. This autumn will see the roll-out of a standard national process for the initial prioritisation of calls, incorporated in the clinical support software. Prioritisation of calls has been carried out since NHS Direct started in 1998, but until now it has been done in multiple ways at different sites. NHS Direct has also introduced new governance arrangements for NHS CAS (Clinical Assessment System) since Easter, and the peer review process has recently evolved using lessons learned from earlier practical problems in engaging a wide range of expert opinion. Objectives for NHS CAS development now include the ability to analyse links between individual nodes in the algorithms and endpoints, plus the ability to receive feedback on outcomes.
Paediatric calls are a very important part of NHS Direct’s work. In the first quarter of 2003–04, 24% of NHS Direct’s calls were for children aged up to 14 years (over two thirds of these for children up to 5 years). The lack of specific focus on paediatrics in the referenced publications on NHS Direct is not a reflection of “indifference” within the service. NHS Direct has worked with CHI on the development of the child protection self audit tool for Boards released earlier this year, has worked with the NSPCC on guidance and training, and is contributing to work on the NSF for Children. Sites have developed training in consultation skills specific to paediatrics with advice from paediatric departments, and there is a precedent for developing these into a standard national form: for example, the “SCAN” training that NHS Direct already uses to train staff in consultation skills for mental health issues. There is more to be done, and the service is not complacent about this aspect of its work. I am grateful for Dr McLellan’s input to NHS Direct during his term as RCPCH Lead for NHS Direct and look forward to working with his successor to continue to address the challenges of this next period of development.
REFERENCES