What can a participatory approach to evaluation contribute to the field of integrated care?
  1. Laura Eyre1,
  2. Michael Farrelly2,
  3. Martin Marshall1
  1. 1Research Department of Primary Care and Population Health, University College London, London, UK
  2. 2School of Histories, Languages and Cultures, University of Hull, Hull, UK
  1. Correspondence to Dr Laura Eyre, Research Department of Primary Care and Population Health, University College London, Ludwig Guttman Health and Wellbeing Centre, 40 Liberty Bridge Road, Olympic Park, Stratford, London E20 1AS, UK; l.eyre@ucl.ac.uk

Abstract

Better integration of care within the health sector and between health and social care is seen in many countries as an essential way of addressing the enduring problems of dwindling resources, changing demographics and unacceptable variation in quality of care. Current research evidence about the effectiveness of integration efforts supports neither the enthusiasm of those promoting and designing integrated care programmes nor the growing efforts of practitioners attempting to integrate care on the ground. In this paper we present a methodological approach, based on the principles of participatory research, that attempts to address this challenge. Participatory approaches are characterised by a desire to use social science methods to solve practical problems and a commitment on the part of researchers to substantive and sustained collaboration with relevant stakeholders. We describe how we applied an emerging practical model of participatory research, the researcher-in-residence model, to evaluate a large-scale integrated care programme in the UK. We propose that the approach added value to the programme in a number of ways: by engaging stakeholders in using established evidence and with the benefits of rigorously evaluating their work, by providing insights for local stakeholders that they were either not familiar with or had not fully considered in relation to the development and implementation of the programme and by challenging established mindsets and norms. While there is still much to learn about the benefits and challenges of applying participatory approaches in the health sector, we demonstrate how such approaches have the potential to help practitioners integrate care more effectively in their daily practice and to help progress the academic study of integrated care.

  • Health services research
  • Qualitative research
  • Evaluation methodology
  • Quality improvement methodologies

Introduction

The promotion of a more integrated approach to care delivery is a major priority in many countries. Poorly integrated care wastes resources, frustrates patients and those who deliver care, can endanger patient safety and is not sustainable given the demographic changes and financial constraints experienced by most health systems.

But the challenges of integration are easier to talk about than to act upon and the research evidence in the field does not appear to have had great impact on practice.1–3 Researchers have tended to focus on whether interventions to improve care integration ‘work’ and the findings are often not convincing.4 In attempting to answer the seductive ‘does it work’ question, there is a risk that researchers end up presenting simple answers to complex questions and producing generalised knowledge that is difficult to localise and therefore not useful. Where research has recognised the complexity of promoting integrated care, for example emphasising the importance of rigorous implementation and of understanding context, there has been insufficient focus on solutions that are actionable by practitioners.5–8 As a consequence, research findings often have little utility for practitioners and so a non-evidence-informed approach to integration persists.

In this paper, we describe how we used a methodological approach that attempts to address this challenge. The approach is based on the principles of participatory research9 using a practical approach to participatory evaluation10 called the researcher-in-residence (RiR) model.11,12 We start by summarising the established evidence base in the field of integrated care and then describe the participatory approach that we used. We then present a case study to illustrate our approach and discuss where and how we think that what we did adds value to the field of integrated care.

What the established evidence tells us about integrated care

There is a growing consensus that integrated care is desirable but difficult to do in practice.1,8,13–19 Positive outcomes can be achieved but they are usually small in scale, inconsistent and highly dependent on local context.1–3,20,21 For example, Powell and colleagues carried out a systematic review of the literature relating to care coordination to assess the effectiveness of a variety of different strategies from a primary care perspective.4 The authors found that only about half of the included studies reported better patient outcomes, about one-third reported improved patient experience and less than one in five of the small number of studies that examined economic outcomes demonstrated any useful cost savings. A growing number of studies have examined the processes underpinning care integration7,8,22 and the associated facilitators and barriers.5,6 Successful integration is highly dependent on local contextual factors and the literature highlights that there are ‘no universal solutions or approaches that will work everywhere’.23

Given the funding and hope that is being invested in integrated care, this evidence does not provide much reassurance that health services are going to be transformed by current approaches to integrating care.

An overview of participatory research

By bridging the gap between research and practice, participatory approaches have the potential to address some of the challenges evident in the current evidence base for integrated care.6,8,23–30 Participatory research is characterised by a desire on the part of stakeholders to solve practical problems and a commitment from researchers to substantive and sustained collaboration with practitioners.9 The focus is on initiating change through reflection, the promotion of greater understanding and shared learning. Most importantly, participatory research embraces a commitment to promoting agency and establishing consensus through negotiation.12

A common way of achieving an effective partnership is to embed researchers with the people or organisations that in traditional models of research would have been the ‘objects’ of the study.31,32 A recent narrative review of the role of embedded research in quality improvement32 identified four characteristic features of embedded research: the researcher is usually affiliated to an academic institution as well as an organisation outside of academia and works between the two; within the affiliated organisation the researcher develops relationships with staff and is seen as part of the team; research is conducted in collaboration with the affiliated organisation and knowledge is coproduced; and the researcher builds internal research capacity within the host organisation.

These characteristics lend themselves to the field of integrated care, where organisations are struggling to implement programmes in a way that will improve patient outcomes and experience and help to control costs. By mobilising established knowledge and creating new understanding, which are both locally relevant and potentially transferable, participatory approaches provide an opportunity to bridge the gap between evidence and practice.

An illustrative case study

Box 1 summarises a case study illustrating how we used a participatory approach to improve the effectiveness of an integrated care programme. The Waltham Forest and East London (WEL) integrated care programme is a 5-year initiative, which started in 2012, and involves health and social care organisations located across three localities in East London, UK.

Box 1

The Waltham Forest and East London (WEL) integrated care programme

The Waltham Forest and East London (WEL) integrated care programme began in 2012 and was one of the 14 ‘pioneers’ charged with leading the development of integrated care to inform national policy and create useful learning for other integrated care initiatives across England.38 The collaboration brought together commissioners, providers and local authorities covering the area served by Barts Health NHS Trust, an acute trust with three main hospital sites serving a population of almost a million. The geography covered the East London local government boroughs of Waltham Forest, Tower Hamlets and Newham.

The partners came together to build a model of integration that looked at the whole person—his/her physical health, mental health and social care needs. The vision was for people to live well for longer, leading more socially active and independent lives, reducing admissions to hospital and enabling access to treatment more quickly. The focus was on empowering individuals by providing responsive, coordinated and proactive care, and ensuring consistency and efficiency across physical and mental health and social care. Interventions included population segmentation and the identification of high-risk groups, the use of care plans and patient navigators, the promotion of multidisciplinary team working and the integration of information systems.

The programme focused on the top 20% of patients who were considered to be most at risk of an admission to hospital in the following 12 months. The cohort was targeted in a phased approach, beginning with those at very high risk of hospital admission and working downwards to cover the full 20% over a 5-year period.
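
As a purely illustrative aid to the phased targeting described above, the sketch below shows one way a risk-stratified population could be split into tranches, starting with the highest-risk patients. The risk scores, field names and function are hypothetical and are not drawn from the WEL programme's actual risk model or data.

```python
# Hypothetical sketch: phased targeting of the top 20% of patients by
# predicted admission risk. Scores, thresholds and field names are
# illustrative only and are not taken from the WEL programme.
import random
from dataclasses import dataclass


@dataclass
class Patient:
    patient_id: str
    admission_risk: float  # predicted probability of admission in the next 12 months


def phased_cohorts(patients, top_fraction=0.20, n_phases=5):
    """Rank patients by risk, keep the top `top_fraction`, and split them
    into `n_phases` tranches, beginning with those at very high risk."""
    ranked = sorted(patients, key=lambda p: p.admission_risk, reverse=True)
    cohort = ranked[: max(1, int(len(ranked) * top_fraction))]
    phase_size = -(-len(cohort) // n_phases)  # ceiling division
    return [cohort[i : i + phase_size] for i in range(0, len(cohort), phase_size)]


# Example: 1,000 synthetic patients with made-up risk scores.
random.seed(0)
population = [Patient(f"p{i}", random.random()) for i in range(1000)]
phases = phased_cohorts(population)
print([len(phase) for phase in phases])  # e.g. [40, 40, 40, 40, 40]
```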

Our approach

Starting in September 2014, we carried out a 24-month qualitative and formative participatory evaluation in partnership with the WEL integrated care programme stakeholders, who included providers and commissioners (payers) of care, clinical staff and health and social care managers. The agreed aim was to help the stakeholders to better deliver the aims and objectives of the integrated care programme. The evaluation protocol is published elsewhere.33

The participatory evaluation used the ‘RiR’ model,11,12 which has three defining characteristics. First, the researchers spend a significant proportion of their time embedded in operational or programme teams and a relatively small amount of time in their academic institution. As core members of the team, they share responsibility for delivering the team's objectives, working alongside managers, clinicians and service users. Second, the researchers bring new skills and expertise to that team, including an understanding of the empirical evidence relevant to the tasks in which they are involved, an ability to use theory to guide change and the skills to evaluate interventions using a range of data sources and types of data. The role therefore involves both mobilising established knowledge and creating new evidence for local use and for wider dissemination. The balance between these two functions may vary by project and may alter as the project progresses. Third, and most importantly, the researchers are both willing and able to negotiate their expertise, integrating and where necessary negotiating knowledge created using the scientific method with the expertise and knowledge used by practitioners.

Data were generated iteratively using ethnographic methods including 225 hours of formal participant observations of meetings and events, informal participant observations of usual working practices, conversations and interactions, a documentary analysis and a total of 124 semistructured interviews carried out in three phases during the 2-year evaluation.

Analysis and interpretation of the data were ongoing and iterative throughout the evaluation using a framework analytical approach34 comprising three clear stages: data management (reviewing, sorting, coding and synthesising the data), descriptive accounts (identifying key themes, mapping the range of themes and developing classifications) and explanatory accounts (building theories and explanatory accounts from the patterns and themes emerging from the data). Findings were triangulated between different data sources. Concepts from critical discourse analysis were also drawn on to inform the iterative processes of data generation and analysis.35
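
As a purely illustrative aid (not part of the published protocol), the short sketch below shows how the first, data-management stage of a framework-style analysis might be organised programmatically: coded excerpts are indexed against a thematic framework so they can be reviewed, sorted and synthesised. All codes, themes and excerpts are invented for the example.

```python
# Illustrative sketch only: indexing coded qualitative excerpts against a
# thematic framework, as in the data-management stage of framework analysis.
# The codes, themes and excerpts below are invented for the example.
from collections import defaultdict

framework = {
    "engagement": ["clinician involvement", "communication"],
    "context": ["competing initiatives", "staff turnover"],
}

coded_excerpts = [
    ("interview_04", "clinician involvement", "GPs felt consulted late in the design"),
    ("fieldnote_12", "competing initiatives", "Team unsure which programme takes priority"),
    ("interview_09", "communication", "Updates rarely reached front-line staff"),
]

# Build a matrix of theme -> code -> list of (source, excerpt), ready for
# charting, review and synthesis across cases.
matrix = defaultdict(lambda: defaultdict(list))
for source, code, excerpt in coded_excerpts:
    for theme, codes in framework.items():
        if code in codes:
            matrix[theme][code].append((source, excerpt))

for theme, codes in matrix.items():
    print(theme)
    for code, items in codes.items():
        print(f"  {code}: {len(items)} excerpt(s)")
```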

Knowledge and learning were interpreted and negotiated jointly between the research team and the programme stakeholders using formal and informal approaches including one-to-one conversations, presentations at meetings and events, sharing written reports, emails, telephone calls and regular interactive workshops.

Where does a participatory approach add value?

The following section describes how the participatory approach that we adopted added value to the WEL integrated care programme in four ways, each of which is likely to be transferable to other similar initiatives:

The approach engaged stakeholders with the use of evidence and the benefits of evaluation

Many practitioners do not have a detailed understanding of relevant research evidence, rarely have an opportunity to interact with researchers and are sometimes sceptical about whether academics can be practically useful.36 However, in this initiative, most stakeholders were receptive to the researcher and good levels of engagement were maintained throughout the evaluation. The researcher enjoyed full access to stakeholder organisations and relevant meetings, was added to all relevant distribution lists and was provided with access to appropriate documentation as and when necessary. Formal and informal meetings to discuss and jointly interpret existing evidence and emerging findings were popular and well received.

The benefits of this high level of access were acknowledged by the participants, though some recognised that moving from having new insights to taking different actions remained a challenge:

[We] have found the RiR and her evaluation an important and integral part of the development of [integrated care] … The qualitative information secured through wide reaching and deep stakeholder engagement has significantly helped with our understanding of how the programme is perceived, its strengths and weaknesses and has directly shaped the development of integrated care moving forward. (Middle manager, commissioner organisation)

I think it’s [working with the researcher] been incredibly useful, it’s given us insights that we simply wouldn’t have been able to access in any other way, particularly as it relates to a really detailed and nuanced understanding … The challenge, which I don’t think we’ve properly risen to, is using the insights intelligently to drive further transformation. (Senior manager, provider organisation)

The researcher held several meetings throughout the evaluation with practitioners who sought advice and guidance relating more broadly to the methods and benefits of evaluation. These interactions provided valuable opportunities to build interest and capacity in evaluation among local stakeholders.

The approach encouraged new insights that stakeholders had not previously appreciated

The evaluation generated a number of new insights relevant to the implementation of integrated care, which might not have been appreciated and certainly had not been discussed by stakeholders prior to the evaluation.

For instance, despite a unified belief that integrated care was ‘the right thing to do’, the evaluation exposed a disconnect between the high-level management and leadership of the programme and the operational delivery of integrated care:

…we haven't paid enough attention to what's going on on the ground. We pay a lot of attention to the strategic level, i.e. capitation, contracting and reimbursement, communications, etc but we're not paying as much attention to the delivery of IC on the ground. (Operational manager, provider organisation)

The evaluation participants, supported by the embedded researcher, identified and explored possible reasons for this. First, the WEL programme was designed originally by senior managers and external consultants with a ‘top down’ approach and relatively little input from front-line staff. Second, in the implementation phase there was inadequate engagement and communication with clinicians. Third, the motivation of senior leaders and managers and front-line staff to engage with the initiative seemed to be different, with front-line staff primarily motivated by opportunities to improve patient experience, improve quality of care and develop better working relationships with colleagues while senior leaders were primarily motivated by the imperative to make financial savings and to reduce hospital admissions. Fourth, there was an inadequate focus on the ‘organisational development’ aspects of change—in particular developing trusting relationships and facilitating changes in staff culture—an issue that will be returned to in the next section.

Additionally, the large number of different and sometimes contradictory policy initiatives at play in the context of the integrated care programme was distracting and created confusion and ‘system inertia’ among staff responsible for the delivery of the programme:

We’ve had quite a lot of new policy [with] quite sizeable funding attached to it with a view to mobilising change. But certainly what I feel like it’s created is system-wide inertia. Because everyone’s just so busy trying to figure out who’s doing what, how this fits with something else, are we duplicating or are we not and who’s best placed to lead on this that actually nobody does anything, or that instead of doing something people are spending much longer figuring out the practicalities of how things link up. So it delays development and innovation. (Middle manager, commissioner organisation)

With so much going on, staff had little capacity to engage with new initiatives and the easiest reaction to new ideas was simply to ignore them.

The approach stimulated insights that stakeholders were familiar with but had not worked through

In addition to these new insights, the evaluation generated a number of insights that were familiar to stakeholders but benefited from having a mirror held up to them. First, it became increasingly clear that the programme leaders were focusing on the tangible elements of integrated care, particularly establishing new structures and governance arrangements, at the expense of organisational, professional and cultural development. This resulted, to a large extent, in the building blocks of integrated care being in place but very little in the way of transformational change being achieved:

There seems to be more…I don’t really want to use the word ‘tick box’ but it’s just sort of like, ‘This is what we have to deliver. We’ve delivered it’ and not really looking at, ‘Well what difference does it make? What’s the point?’ It's just sort of like, ‘We’ve done this’. A great example of this is, ‘We have done twelve thousand care plans’. OK, so what? You know? Rapid Response don’t use them. GPs don’t use them. If you ask any of the patients, they wouldn’t know anything about them. In the MDTs I haven’t heard anything about people using the care plans. It’s more around activity and volumes and it’s that gloss isn’t it, ‘We’ve done all this’, but… (Middle manager, commissioner organisation)

The programme did not focus enough on the people at the heart of integration processes, specifically the development of trusting relationships, working practices and, ultimately, shared responsibility for the commissioning, management and delivery of integrated care.

Second, pockets of good practice were evident across WEL in relation to integrated working between health and social care professionals at the level of service delivery. There was however a lack of consistency around the form, function and organisation of integrated working. Localities had little success in scaling up small-scale examples of good practice:

It’s a very hard to get a sense of whether or not, from an operational perspective, services are really performing as well as they could be and whether integration is actually happening. And I think the sense that I get is that it happens in pockets, but it’s not consistent. It’s based on personalities rather than a standardised process or system being in place. And I think that that makes it very vulnerable to falling apart or not being sustainable. (Senior manager, commissioner organisation)

Third, the relationships between the health professionals expected to deliver more integrated care were often suboptimal. These problems were exacerbated by a number of factors, including a lack of continuity in personnel with a high turnover of clinical and especially managerial staff, inadequate capacity to engage with new ways of thinking or new activities, a lack of understanding of the roles and responsibilities of colleagues in different parts of the health and social care system leading to unrealistic expectations of one another and technical difficulties such as contractual constraints to partnership working and problems with information exchange.

Finally, despite being clearly founded on the principles of person-centred care, there was inadequate engagement with and involvement of people and communities who used services in the development and implementation of the programme.

The approach challenged established mindsets and norms

The position of the researcher as someone embedded yet able to operate at a critical distance was judged by the participants to be useful. While achieving a balance in the ‘insider/outsider’ role was challenging, the ability of the researcher to build and maintain relationships allowed her to have a unique and practical influence on the programme:

Your input at the [X] Board was really helpful today—I'm really keen that the issues raised are dealt with operationally through [X] and more strategically through the [X] work streams as appropriate. (Senior manager, provider organisation)

Just wanted to say it's a really helpful report with loads of helpful insights … I am picking up the findings with [name] and [name] so we can take forward … (Middle manager, commissioner organisation)

On one occasion, the researcher-in-residence experienced some difficulty when discussing emerging findings that were not wholly positive and were not what one senior stakeholder, new to the programme, wanted to hear. One particular intervention that challenged the individual concerned was labelled dismissively as ‘not useful’. Over the next few weeks, outside formal meetings, the researcher worked hard to build a working relationship with the individual concerned, and at a subsequent team meeting the senior manager lauded the insight provided by the researcher.

Discussion

The importance of integrating care for patients is well recognised but the challenges of doing so are significant. Integrated care is a good example of a policy initiative characterised by enthusiastic system leaders, frustrated practitioners and largely unhelpful research evidence. This study explored the potential of a participatory evaluation to move both the practice and study of integrated care into a more positive place.

The evaluation demonstrated how using a participatory approach can add value to an integrated care programme in four ways: first, by engaging stakeholders in using established evidence and helping them to understand the benefits of rigorously evaluating their work; second, by encouraging new insights that stakeholders had not previously appreciated, such as a preoccupation with structures and the lack of attention paid to developing effective working relationships; third, by holding up a mirror to issues that were familiar to stakeholders but that they had not addressed, such as the impact of manager turnover on the commitment to the initiative of front-line staff; and fourth, by challenging established mindsets and norms of the programme leaders and front-line staff.

Some of these findings are local and practical in their orientation and our view is that participatory approaches to evaluation add great value to clinical and managerial practice. However, in developing a deeper understanding of working processes and context they also help to unblock an academic field, which has become increasingly frustrated and frustrating, a field where the evidence that exists is partial, insufficiently sophisticated and, even where useful, is often ignored.

Participatory research is commonly used in the fields of education and community development but with a few exceptions it has received a lukewarm reception in the health sector, where the more positivist concepts of objectivity and external validity trump more interpretivist approaches to the creation of new knowledge.37 Participatory models of evaluation should never replace established methodologies and there are a number of challenges that have been described elsewhere,6–8 including the risks of researchers being ‘captured’ by their practitioner colleagues, ethical issues and how best to develop a model supporting the professional development and advancement of researchers. These challenges will need to be addressed before mainstream research funders embrace participatory evaluation as a legitimate area for significant investment.

Despite these challenges, this study demonstrates how a participatory approach can add value to both academic study and practice. We believe that the learning from this study in the field of integrated care is transferable to other fields where research evidence is insufficiently developed and poorly mobilised. This is particularly the case in an environment in which policymakers and research funders expect research to have a significant and rapid impact on practice and in which universities are exploring ways of rising to this challenge.

Researchers have far greater potential to be useful to practitioners than they sometimes realise and practitioners have far more to contribute to and to learn from research than they appreciate. Participatory evaluation is a model that can realise this potential for both groups, and for the benefit of those who use health services.

Acknowledgments

The authors are grateful to the staff involved in the Waltham Forest and East London integrated care pioneer programme for their collaboration in developing the methodology presented here and participating fully in the evaluation.

References

Footnotes

  • Twitter Follow Martin Marshall @MarshallProf

  • Funding This paper presents research funded and commissioned by the Waltham Forest and East London integrated care pioneer programme.

  • Competing interests MM is a general practitioner in Newham.

  • Ethics approval University College London Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.