Abstract
Background: In England, 42 Integrated Care Systems (ICSs) have been responsible for organising and delivering integrated health and social care services since July 2022. ICSs bring together previously siloed organisations to deliver quality care in their local area. To understand whether integration is useful, it needs to be evaluated systematically and comprehensively. This study aimed to understand how ICSs use measures to assess and improve quality.
Methods: A case study method was used, and four contrasting ICSs (varying in population size, location and experience of integrated working) were recruited. Semi-structured interviews (n=112) were conducted with senior staff from November 2021 to May 2022 (baseline, n=70) and January to May 2023 (follow-up, n=42). Interviews were conducted over MS Teams, audio-recorded and transcribed verbatim. A thematic analysis was undertaken to identify patterns and themes.
Results: There was consensus on the importance of using metrics to evaluate and improve quality in ICSs. Participants revealed that multiple, sometimes overlapping, health-oriented indicators existed, but there was a relative lack of metrics for social care, public health and integration. Metrics tended to be assurance- or performance-focused, with participants expressing an appetite to move towards more outcome-based and broader measurement.
To ensure that measures support quality, appropriate data analysis was necessary. Participants described a patchwork of analytical capacity, with different external bodies (e.g. commissioning support units) responsible for data analysis. It was unclear how ICSs could best harness this external capacity or what ‘in-house’ capacity might be needed. Participants highlighted the need to triangulate information from different sources (e.g. data about complaints, patient experience or staff feedback) and types of data (i.e. using qualitative data to contextualise quantitative findings) to identify underlying issues and solutions.
Additional barriers to data collection (e.g. electronic records were not always available) and data sharing (e.g. information governance issues) were discussed. Efforts to create integrated care records offering the possibility of real-time data collection were still in progress. Many thought that system dashboards presenting data meaningfully should be developed to measure quality against system strategies.
Discussion: There was consensus among senior staff that metrics are important for assessing and evaluating quality in ICSs, and participants identified challenges that need to be overcome. Various ‘key metrics’ have been offered by external bodies. The Care Quality Commission (2022) published a single assessment framework that includes a broader array of outcomes across safety, experience, equity and access, while also examining partnership working and public/patient involvement. The value and validity of any new measures remain to be tested in real-world contexts. Processes from research, such as the ‘core outcomes movement’, could help identify metrics for ICSs. In this approach, information from patients and clinicians on meaningful outcomes for particular clinical conditions is combined with evidence on the measurement properties of those outcomes to identify a core outcome set for use across research trials (Williamson et al., 2012). Such an approach could help rationalise the number of measures and identify metrics that are most likely to drive change and improvement in integrated care systems.
