Diabetes Quality Programme - Self-Assessment and External Verification report 2018

This report provides data and insights from the first Self-Assessment and External Verification components of the National Children and Young People's Diabetes Quality Programme. With this baseline data, we will be able to make comparisons year on year, and reflect on what is working well and what changes need to be made to ensure we are all delivering the safest, most effective and patient-centred diabetes care. This data also supports and informs the Peer Review process.
Last modified
13 November 2020

I hope [this report...] gives you encouragement to continue all your efforts towards improving your own paediatric diabetes unit performance

Dr Fiona Campbell, Clinical Lead

The following is a summary version. The full report can be downloaded below.


By Dr Fiona Campbell, Clinical Lead, NCYP Diabetes Quality Programme

The overall aim of the National Children and Young People’s Diabetes Quality Programme is to promote better healthcare outcomes and to allow patients, NHS Providers and Commissioners to see how well individual Paediatric Diabetes Units are succeeding in offering high quality care for all young people living with diabetes. An integral part of the programme is Self-Assessment against a set of clinical measures and External Verification of Self-Assessment validity. I am pleased to introduce our first Self-Assessment and External Verification Report, generated after the 2018 round of data collection and analysis.

This first report is an important part of the evaluation of the Peer Review Programme overall and provides us with valuable baseline data against which comparisons can be made year on year. Completing the online data submission has allowed every participating unit to reflect on how well it provides diabetes care, what is working well and what changes may need to be made to ensure we are all delivering the safest, most effective and patient-centred diabetes care.

This report contains the background to the Peer Review Programme and summarises the initial Self-Assessment and External Verification findings in both narrative and graphical format. Conclusions have been drawn and future plans laid out.

I hope you find that this report is interesting to read and gives you encouragement to continue all your efforts towards improving your own paediatric diabetes unit performance year on year.

Programme background

The National Children and Young People’s Diabetes Quality Programme (NCYPDQP) is a three-year integrated programme established in April 2018 in collaboration with the National Children and Young People’s Diabetes Networks of England and Wales. The NCYPDQP encompasses an annual, online, Self-Assessment (SA) against best clinical practice measures and a Peer Review process, alongside a series of Quality Improvement Collaboratives aimed at reaching every participating service. It is designed to help multi-disciplinary teams (MDTs) in Paediatric Diabetes Units (PDUs) to transform the way they work to improve outcomes and deliver best practice care as efficiently as possible. The NCYPDQP is centrally managed by the RCPCH with the support of the 11 regional CYP Diabetes Networks and involves clinical teams across England and Wales.

The Self-Assessment measures are grouped by responsible team: MDTs (multidisciplinary teams); Hospital Trust / Health Boards; and Networks. Most Trusts and Health Boards have a single MDT which covers all sites delivering diabetes care, but some have more than one locality-based MDT. The MDT and Hospital / Health Board assessment is completed by the MDTs, with sign-off by the Clinical Lead and Medical Director (or nominated representative). Network Managers complete the Network measures, with sign-off by their Clinical Chair or a senior manager at the hosting organisation. The web-based Self-Assessment is designed to be simple and efficient to complete: most responses are yes / no options, with free-text entry for the details of named roles and sign-off. There is no requirement for the routine uploading of documentation; services are, however, expected to have the evidence available if requested by the RCPCH staff team as part of External Verification and Peer Review.


The measures are structured as follows:

  • Network measures – 34 elements in 12 measures
  • Hospital Trust / Health Board measures – 25 elements in 6 measures
  • MDT measures – 95 elements in 27 measures

Scoring is based on one point per element; so, for example, a service may score a total of 120 points based on 95 MDT elements and 25 Hospital / Health Board elements. The Self-Assessment is wholly user scored: findings are based on the self-reported scores and have not been triangulated with additional sources of information except for those subject to External Verification or unless otherwise specified.
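The one-point-per-element scheme described above can be sketched as a short calculation. This is a minimal illustration only: the dictionary and function names are hypothetical, with the element counts taken from the list above.

```python
# Element counts per measure group, as listed in the report.
ELEMENTS = {
    "Network": 34,                # 34 elements in 12 measures
    "Hospital/Health Board": 25,  # 25 elements in 6 measures
    "MDT": 95,                    # 95 elements in 27 measures
}

def max_score(groups):
    """Maximum points a service could score across the given measure
    groups, at one point per element."""
    return sum(ELEMENTS[g] for g in groups)

# A service completing the MDT and Hospital / Health Board assessments
# can score at most 95 + 25 = 120 points.
print(max_score(["MDT", "Hospital/Health Board"]))  # → 120
```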

The detailed list of Self-Assessment measures is available to download below.

External Verification

External Verification was conducted with 25 representative PDUs to provide a consistency check of the Self-Assessment process, identify any systemic errors and check whether appropriate evidence was available to support their submission.

The evidence was reviewed to determine to what extent it supported the Self-Assessment. The Measures (and elements) most commonly scored as compliant but which were not subsequently evidenced were:

  • M3 – Clinical guidelines
  • M4 – Pathways of care
  • M14 – Support for children in education policies
  • M15 – Evidence of annual screening
  • M16 – Transition
  • M20 – Lifestyle objectives
  • M25 – NPDA findings
  • M26 – Reviewing admissions
  • M27 – Reviewing data on patients who were not brought to appointments (across age bands)

In addition, in some instances the submitted high HbA1c guidance was not in line with NICE guidance, stating 75 mmol/mol rather than 69 mmol/mol.

Summary of initial findings: Hospital / Health Board Measures

  • Measure H4 (Point of Care testing) had the greatest compliance, with 97.9% of services reporting that they met the measure
  • Measure H2 (24-hour telephone advice) had the least compliance, with 89.4% of services reporting that they met the measure
  • In 8% of services, download facilities for information from insulin pumps, continuous blood glucose monitors and blood glucose meters, in time for discussion at the clinic appointment, were reported not to be available in all clinics on all sites

Summary of initial findings: MDT measures

  • MDTs reported 100% compliance with Measure M17 (at least one CYPD MDT representative participating in 75% of CYPD Network meetings)
  • Measure M1 (MDT Core Membership) details the nine core roles within the team. The most frequently reported gaps in provision were Inpatient Ward Link Nurse and Lead Clinical Psychologist, followed by secretarial / administrative support
  • Measures relating to Transition / Transfer were most frequently reported as unmet

Summary of initial findings: Network measures

  • 100% of Networks reported that they met Measure N1 (Network configuration), Measure N3 (Frequency of Network Meetings), Measure N4 (Annual Report) and Measure N5 (Annual Service Development Proposals)
  • Patient Experience (Measure N9) was reported to be met by the fewest Networks
  • All Networks reported having at least two patient/parent/carer representatives as part of the Network Membership


Nearly 98% have Point of Care testing available at all sites

The most challenging measure for Hospitals and Health Boards to meet was 24-hour telephone advice across all elements

The first round of the Self-Assessment and External Verification workstreams has provided a baseline against which future activities can be measured and compared.

The Self-Assessment is completed by nominated individuals and does not require the uploading of documentation to evidence answers; answers are instead corroborated through the External Verification process for identified units and, later, during Peer Review visits. Outcomes of the early phases of the Peer Review visits will enable an exploration of the reasons for changes to the individual scoring of measures prior to, and at, the visit. They will also provide the opportunity to gain a deeper understanding of the challenges underlying the measures that are commonly unmet, and to identify good practice where measures are met or exceeded.

Next steps

The second and third annual Self-Assessments, in June 2019 and June 2020, will seek an update from participating MDTs, Hospitals / Health Boards and Networks.

These rounds will allow:

  • A comparison of outcomes from 2018-19 with 2019-20 to monitor change
  • Further exploration of key themes arising from the Self-Assessment and External Verification analysis
  • Inclusion of information gained during the course of Peer Review visits to support verification and identification of themes for analysis
  • Exploration of the reasons for changes to the scoring of reported self-assessment measures between 2018, 2019 and 2020
  • Comparative analysis focused on specific measures and service context to further understand the most common challenges to delivering good clinical care and areas of good practice worthy of sharing