
Learning Disability Services Monthly Statistics, AT: October 2024, MHSDS: September 2024

Official statistics, Experimental statistics



Error affecting Table 4.6

An error affecting the “Patients with planned transfer/discharge date of which authority is aware” data by age and patient category within table 4.6 has been identified. This data did not correctly reflect the change to this question in AT v4 from 1 April 2024 and has been withdrawn as a result.

The total number of patients with planned transfer/discharge date of which authority is aware is available in table 3.2 and is unaffected by the error which affected table 4.6.

The affected section of table 4.6 has been reinstated from the January 2025 publication (December 2024 data) onwards.

16 January 2025 09:30 AM

Changes to AT dataset from April 2024

Significant changes were made to the AT dataset (v4.0) from 2 April 2024. Some of these changes affected the breakdowns available in the AT data tables, and these have been footnoted where relevant.

Full details about these changes can be found within the current release of the AT information standard.

21 November 2024 09:30 AM

Decommissioning of AT and MHSDS comparators file

Up until the edition of the Learning Disability Services publication released in April 2024, a comparators file showing the headcounts of inpatients across AT and MHSDS for each provider, as well as the headcount at England level, was included. We have decommissioned this file from the May 2024 release onwards to streamline our processes, as key figures can be calculated using the data files we routinely publish as part of the Learning Disabilities and Autism publication series.

Full details are available within the Data Quality - MHSDS page under 'Accuracy and reliability'.

21 November 2024 09:30 AM

Data quality statement - MHSDS

Purpose of document

This data quality statement aims to provide users with an evidence-based assessment of the quality of the statistical output included in this publication.

It reports against those of the nine European Statistical System (ESS) quality dimensions and principles that are appropriate to this output. The original quality dimensions are: relevance, accuracy and reliability, timeliness and punctuality, accessibility and clarity, and coherence and comparability; these are set out in Eurostat Statistical Law. However, more recent quality guidance from Eurostat includes some additional quality principles on: output quality trade-offs, user needs and perceptions, performance, cost and respondent burden, and confidentiality, transparency and security.

In doing so, this meets NHS England’s obligation to comply with the UK Statistics Authority (UKSA) code of practice for statistics and the following principles in particular:

  • Trustworthiness pillar, principle 6 (Data governance) which states “Organisations should look after people’s information securely and manage data in ways that are consistent with relevant legislation and serve the public good.”
  • Quality pillar, principle 3 (Assured Quality) which states “Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely.”
  • Value pillar, principle 1 (Relevance to Users) which states “Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported.”
  • Value pillar, principle 2 (Accessibility) which states “Statistics and data should be equally available to all, not given to some people before others. They should be published at a sufficient level of detail and remain publicly available.”

Relevance

This dimension covers the degree to which the statistical product meets user need in both coverage and content.

This publication comprises a report produced from learning disabilities and autism service providers’ MHSDS final data for the reporting period covered by this publication page. The information provided in this publication series is the most timely available from providers for LDA services in England.

All providers of NHS funded specialist mental health and learning disability services should submit to the MHSDS. However, at present not all independent sector providers are making submissions, and this has an impact on the completeness, particularly in areas such as inpatient care and forensic services, where the independent sector provides much of the NHS funded care. A coverage report is included within the main MHSDS publication showing the number of providers submitting each month and number of records submitted. When an organisation starts or ceases to submit data this can affect overall record numbers.

The main MHSDS publication and associated files can be found here:

https://digital.nhs.uk/data-and-information/publications/statistical/mental-health-services-monthly-statistics

For LDA, the number of providers submitting data is lower than the number recorded as providing services in the Assuring Transformation publication. To reflect this low coverage, England/national totals are not displayed for LDA measures; instead the ‘Total of submitted data’ is presented. Caution is advised when interpreting these data as they may under-represent LDA services at a national level.


Accuracy and reliability

This dimension covers the proximity between an estimate and the unknown true value of the statistics.

Experimental statistics

The statistics in this publication are marked as experimental and may be subject to further change as we develop our statistics. The classification of experimental statistics is in keeping with the UK Statistics Authority’s Code of Practice. Experimental statistics are new official statistics that are undergoing evaluation. They are published in order to involve users and stakeholders in their development, and as a means to build quality at an early stage. The Code of Practice states that “effective user engagement is fundamental to both trust in statistics and securing maximum public value…” and that, as suppliers of information, it is important that we involve users in the evaluation of experimental statistics.

Note: while the impact from the cyber incident continues these statistics will be badged as Management Information.

Submission validations

The MHSDS is a rich, referral-level dataset that records the packages of care received by individuals as part of referrals into treatment within NHS funded learning disabilities and autism services. These packages of care vary widely, which means that each record contains different elements of the dataset. Therefore, no single approach can measure the completeness and accuracy of the data collected and reported nationally. However, NHS England provides a number of different reports at different stages in the data flow to ensure that the submitted data reflect the services that have been provided.

At the point of submission:

  • Providers receive immediate feedback on the quality of their submission, including detailed Data Summary Reports about coverage, volume, code validity and data consistency. Providers have the opportunity to re-submit data up to the deadline and to send a refresh submission one month later.

On receipt of processed data by NHS England:

  • Where there are concerns about data quality we contact providers directly so that any issues with local data extraction processes can be addressed for a future submission. These checks are currently limited to key elements of the dataset. Additional checks will be developed as part of future submissions until they offer the same level of coverage as those previously available for MHLDDS submissions. We also issue individual monthly Data Quality Notices to all providers highlighting key data quality issues.

Data quality reporting

As part of the main MHSDS publication national and organisation level data quality measures are shown that validate a selection of key data items by provider. These show the proportion of records as counts and percentages which have ‘valid’, ‘other’, ‘default’, ‘invalid’ and ‘missing’ values for key elements of the dataset, such as Team Type and Primary Reason for Referral. A coverage report shows the number of providers submitting data each month and the number of records by provider and by table. These elements will be expanded upon in future submissions.
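The five-way breakdown described above can be sketched as a simple classifier over submitted values. This is a minimal illustration only: the code lists (`valid_codes`, the `X999` default and the sample values) are made-up assumptions, not the real MHSDS vocabularies.

```python
from collections import Counter

# The five categories used in the MHSDS data quality measures
CATEGORIES = ("valid", "other", "default", "invalid", "missing")

def quality_breakdown(values, valid_codes, default_code="X999", other_codes=()):
    """Classify submitted values for one data item (e.g. Team Type) into
    the five data-quality categories; return counts and percentages.

    valid_codes / default_code / other_codes are illustrative assumptions,
    not the real MHSDS code lists.
    """
    counts = Counter()
    for v in values:
        if v is None or v == "":
            counts["missing"] += 1       # nothing submitted
        elif v == default_code:
            counts["default"] += 1       # placeholder/default code used
        elif v in valid_codes:
            counts["valid"] += 1         # recognised national code
        elif v in other_codes:
            counts["other"] += 1         # permitted but non-standard code
        else:
            counts["invalid"] += 1       # unrecognised value
    total = len(values)
    if total == 0:
        return {c: (0, 0.0) for c in CATEGORIES}
    return {c: (counts[c], 100 * counts[c] / total) for c in CATEGORIES}
```

For example, five submitted values containing two valid codes, one default, one blank and one unrecognised code would report 40% valid and 20% in each of the default, missing and invalid categories.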

To support NHS England’s change in Care and Treatment Review (CTR) policy, from December 2017 we have included some data quality analysis on the CTR data within the LDA reference tables. We have also included a data quality tab which includes the counts below on the use of the three SNOMED codes that relate to CTRs.

  • The number of providers who are submitting the CTR SNOMED codes
  • The number of each CTR SNOMED code being used within the month

Care and Treatment Reviews (CTRs) were developed to improve the care of people with learning disabilities, autism or both in England with the aim of reducing admissions and unnecessarily lengthy stays in hospitals and reducing health inequalities.

Comparing MHSDS and Assuring Transformation data

Previous Learning Disability Services publications have included a comparators file showing the headcounts of inpatients across AT and MHSDS for each provider, as well as the headcount at England level. We have decommissioned this file to streamline our processes as key figures can be calculated using the data files we routinely publish as part of the Learning Disabilities and Autism publication series.

To locate the figures previously presented as part of the comparators file, please follow these notes:

On the monthly publication pages scroll down to Data Sets, where there are links to pages for Assuring Transformation and MHSDS:

  • Learning disability services monthly statistics from Assuring Transformation dataset: Data tables

  • Learning disability services monthly statistics from MHSDS: Data tables

In each case, choose the Excel file labelled “Data Tables”, not the “csv” file.

Assuring Transformation data is published each month with data as at the end of the previous month (for example, the April publication contains data extracted at the end of March). However, MHSDS is published one month further in arrears (the April publication contains data as at the end of February). So when comparing the datasets, the penultimate month in the AT data file should be compared with the latest month in MHSDS.
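The month alignment described above can be sketched in a few lines. This is a minimal illustration under the stated publication lags; the function names are our own, not part of any NHS England tooling.

```python
from datetime import date

def previous_month(d: date, n: int = 1) -> date:
    """Step back n calendar months, using day 1 as a month marker."""
    y, m = d.year, d.month - n
    while m < 1:
        m += 12
        y -= 1
    return date(y, m, 1)

def comparable_months(publication: date) -> dict:
    """For a given monthly publication date, return the latest reference
    month in each dataset and the month on which they can be compared."""
    at_latest = previous_month(publication, 1)     # AT: data as at end of previous month
    mhsds_latest = previous_month(publication, 2)  # MHSDS: one further month in arrears
    return {
        "at_latest": at_latest,
        "mhsds_latest": mhsds_latest,
        # AT's penultimate month coincides with MHSDS's latest month
        "compare_on": mhsds_latest,
    }
```

For the April 2024 publication this gives AT data for March 2024 and MHSDS data for February 2024, so the comparison is made on February 2024.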

The following example is based on the April publication (AT March 2024, MHSDS February 2024).

In the AT data tables, Table 1.1 includes information on:

In MHSDS, the equivalent data is recorded in Table 2 as the number of ward stays.

In MHSDS data on inpatient ward stays, admissions and discharges is also shown broken down by provider in Table 6.

Table 5.3 of the AT data tables shows the number of inpatients at the end of each month broken down by provider.

The comparators data file historically removed respite care episodes from the data for both AT and MHSDS to improve the accuracy of the comparison. The number of respite care episodes is shown in both the MHSDS (Table 2) and AT (Table 2.2) data tables at England level, so it is possible to remove these figures from the total inpatient numbers. However, the number of respite care episodes recorded in both MHSDS and AT in recent publications has been very low (around 15) and similar between the datasets, so a comparison of the total numbers, without removing respite, should be equally valid.
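As a worked illustration of the respite adjustment, with entirely hypothetical figures (the real values come from the published data tables, e.g. AT table 2.2 and MHSDS table 2 for respite):

```python
# Hypothetical figures for illustration only.
at_total_inpatients = 2045      # example: AT inpatient headcount
at_respite = 15                 # example: AT respite care episodes

mhsds_total_ward_stays = 1980   # example: MHSDS ward stay count
mhsds_respite = 14              # example: MHSDS respite care episodes

# Comparable totals with respite removed
at_comparable = at_total_inpatients - at_respite            # 2030
mhsds_comparable = mhsds_total_ward_stays - mhsds_respite   # 1966
```

Because the respite counts are small and similar in both datasets, the adjusted and unadjusted comparisons differ only marginally.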

For further details about the Assuring Transformation collection please visit:

https://digital.nhs.uk/data-and-information/data-collections-and-data-sets/data-collections/assuring-transformation/reports-from-assuring-transformation-collection

Interpreting uses of restrictive interventions in inpatient services

The MHSDS is derived from administrative systems rather than being a specific purposeful collection. The quality and completeness of particular data items from particular data providers is dependent on the use to which the data has so far been put. Due to this, data items which are used regularly for analysis or have been the focus of particular attention will be of better quality than less frequently used items, or those for which in depth analysis has not yet taken place. Monthly figures on the use of restrictive interventions in inpatient learning disabilities and autism services were first published for January 2019. As such the data used to derive these falls into this latter group and may be of lower quality than other data used in this publication.

Further assessment of the quality and completeness of these experimental statistics will follow this publication. This will use the statistics presented here as the basis of discussions with service providers to understand any issues, and to guide the development of the methodologies used to create statistics on the use of restrictive interventions in these services.

The data used to derive these figures may contain duplicates. Multiple interventions with identical dates and details (intervention type and duration) for the same individual have been identified. Currently it is unknown if these values are duplicates, record errors or genuine separate incidences therefore no data has been excluded. These potential issues may lead to the number of restrictive interventions shown in this publication being unreliable. As such these figures should be used with caution.

Statistics showing the number of people subject to restrictive interventions will not be affected by duplication and so are more reliable than statistics on the number of restrictive interventions. However, statistics on the number of people subject to restrictive interventions are still subject to other potential quality and completeness limitations. These statistics should be used in light of these limitations.

There has been a general increase in the quality and completeness of restraints information submitted by providers over time. Over the COVID period, there was a notable increase in the number of restraints for some providers because more, shorter incidents of restraint per patient were recorded separately (as per guidance). Therefore, caution should be taken when interpreting the data.


Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

The MHSDS LDA reports have been produced within three months of the end of the reporting period and within five weeks of the submission deadline.

The submission deadlines for MHSDS are published here:

digital.nhs.uk/data-and-information/data-collections-and-data-sets/data-sets/mental-health-services-data-set/mental-health-services-data-set-specifications-and-guidance


Accessibility and clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

Re-use of our data is subject to conditions outlined here:

https://digital.nhs.uk/about-nhs-digital/our-work/keeping-patient-data-safe/gdpr

Definitions for measures included in this publication are available in the accompanying metadata file. Terminology is defined where appropriate. These definitions will be developed, and further guidance provided in future editions of this publication series when needed.

Full details of the way that MHSDS returns are processed, which will be of use to analysts and other users of these data, are provided in the MHSDS User Guidance, available on the legacy NHS Digital website:

https://digital.nhs.uk/data-and-information/data-collections-and-data-sets/data-sets/mental-health-services-data-set/mental-health-services-data-set-specifications-and-guidance


Coherence and comparability

Coherence is the degree to which data which have been derived from different sources or methods but refer to the same topic are similar. Comparability is the degree to which data can be compared over time and domain.

From October 2016, the MHSDS LDA inpatient data has been compared to the Assuring Transformation collection. There is a slight difference in scope between these two data collections. The MHSDS data is submitted by providers based in England and covers care provided in England, including care that may be commissioned from outside England. The Assuring Transformation data, by contrast, is provided by English commissioners; the healthcare will typically be provided in England, but the data also includes care commissioned in England and provided elsewhere in the UK.


Trade-offs between output quality components

This dimension describes the extent to which different aspects of quality are balanced against each other.

Although the collection of LDA data via MHSDS commenced in January 2016, some providers continue to experience issues making a comprehensive submission within the permitted timescales. We expect a more complete and accurate picture to emerge over time. This analysis presents an early view and is subject to caveats both in terms of the completeness of the submission, particularly for services that have only come within scope of the dataset since 1 January 2016, and the limits of the data that could be provided about pathways into services to support monitoring of waiting times.

The format of this publication has been determined to enable timely reporting of key measures while adjusting the scope of analysis to be achievable within NHS England resources and production time. Further work on data quality issues with providers is planned to help increase the usefulness and usability of these statistics for different users. Through this work, we hope to support discussions with and between providers and commissioners about caseload and activity, to help narrow the differences between the two data sources, so that we can move to one official source of LDA information in the long term.


Assessment of user needs and perceptions

This dimension covers the processes for finding out about users and uses and their views on the statistical products.

The purpose of the MHSDS LDA monthly reports is to provide learning disability and autism service providers, commissioners and other stakeholders with timely information about caseload and activity. This is intended to support changes in commissioning arrangements as services move from block commissioning to commissioning based on activity, caseload and outcomes for patients.

We undertook a consultation on our adult mental health statistics during 2015 and published the results in November 2015. Changes to the MHLDS Monthly Reports that were previously published from MHLDDS are described in a Methodological Change Paper. The introduction of statistics to support the monitoring of waiting times is in line with the ambitions set out in NHS England’s Five Year Forward View for Mental Health, and we will introduce further waiting time measurements in line with priorities identified with interested parties.

Regular consultation with customers and stakeholders is undertaken to ensure that developments introduced to the publication meet their requirements.


Performance, cost and respondent burden

This dimension describes the effectiveness, efficiency and economy of the statistical output.

The dataset preceding MHSDS (MHLDDS) was identified as the data source to replace others in the Fundamental Review of Returns programme, which was designed to reduce burden on the NHS. As a secondary uses data set, it is intended to re-use clinical and operational data from administrative sources, reducing the burden on data providers of having to submit information through other primary collections.


Confidentiality, transparency and security

This dimension covers the procedures and policies used to ensure sound confidentiality, security and transparent practices.

Submissions have been processed in line with the rules described in the Technical Output Specification for the dataset, using a fully assured system that pseudonymises individual identifiers. As for all NHS England publications, the risk of disclosing an individual’s identity in this publication series has been assessed, and the data are published in line with a Disclosure Control Method for the dataset approved by NHS England’s Disclosure Control Panel.

Please see links below to relevant NHS England policies:

Statistical Governance Policy

Freedom of Information Process

https://digital.nhs.uk/about-nhs-digital/contact-us/freedom-of-information

A Guide to Confidentiality in Health and Social Care

https://digital.nhs.uk/about-nhs-digital/our-work/keeping-patient-data-safe/how-we-look-after-your-health-and-care-information

Privacy and Data Protection

https://digital.nhs.uk/about-nhs-digital/privacy-and-cookies

GDPR

https://digital.nhs.uk/about-nhs-digital/our-work/keeping-patient-data-safe/gdpr


Last edited: 14 January 2025 10:51 am