Keywords:

  • children;
  • complex needs;
  • continuing-care;
  • health systems;
  • implementation;
  • nurse;
  • policy;
  • realist;
  • theory-based evaluation

Abstract


Aim

To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy.

Background

Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation.

Design

Research methodology – Evaluation using theory-based realist methods for policy implementation.

Methods

An expert group developed the policy and supporting tools. The implementation and evaluation design integrated diffusion of innovation theory with multiple case study design and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of the new decision-support tool and care pathway to optimize local implementation. Methods included key-stakeholder interviews; development of practical diffusion of innovation processes using key opinion leaders and active facilitation strategies; and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007–2008.

Results

Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled ‘real-time’ manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes.

Conclusions

Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations ‘naturally’ without facilitated implementation and local optimization.

Why is the evaluation needed?

  • Government health policy is frequently conveyed in a top-down approach from Government to Chief Executives to service managers to implement locally.
  • Policy implementation is known to vary considerably and policies frequently fail to bring about the desired improvements in patient outcomes.
  • Realist methods have not yet been adapted for complex policy implementation scenarios, but offer potential opportunities for policy implementation and best-practice optimization through co-productive identification of what works.

What are the three key findings?

  • The exemplar illustrates a significant advance in policy implementation and evaluation methodology and demonstrates the advantages of using a ‘realistic’ approach to optimize implementation.
  • Ten mechanisms explained implementation success and differences in outcomes between new and existing policy and processes.
  • Active facilitation strategies by Consultant Nurses and local optimization with practitioners are critical factors for successful policy implementation that enable best-practice benchmarking of ‘what works’ locally.

How should the findings be used to influence policy/practice/research/education?

  • A novel adapted realist policy implementation approach has advantages over top-down approaches and is generalizable to other complex policy implementations, especially where there is low-level or dispersed clinical expertise.
  • The evaluation provides a new and comprehensive picture of the effective components of a children's continuing-care framework that can be used to benchmark national and international practice.
  • The new decision-support tool incorporates additional domains not previously included, so it more appropriately identifies children who are likely to have continuing-care needs.

Introduction


This methodological exemplar illustrates the first ‘real-time’ realistic implementation, optimization and evaluation of a complex Government nurse-led children's policy using principles of theory-based evaluation.

Background

Governments frequently apply a top-down approach to policy development with fairly rapid implementation to demonstrate to the electorate that election pledges are being kept. Top-down implementation is usually via wide dissemination of policy documents, guidelines and frameworks to local Chief Executives to implement in their organizations. Health policies are increasingly complex, involve multiple Government departments and are implemented into complex systems that include many agencies and organizations (Eccles et al. 2009).

Policy implementation can be unpredictable and fail to translate into better outcomes for patients (Gunn 1978, McLaughlin 1987, Eccles et al. 2009). In austere economic times, Governments have the additional concern that new policies may cost more than anticipated. The Government in Scotland, for example, underestimated the cost of implementing free social care (Bell & Bowes 2006).

Greater understanding is emerging about the importance of:

  • Programme theory (how the intervention is intended to work),
  • Complexity (multiple components of an intervention, intervention context and their inter-related mechanisms of action),
  • Context (where and how the intervention is implemented) and
  • Critical success factors that may create and sustain mechanisms leading to successful outcomes (Astbury & Leeuw 2010, Schultz & Kitson 2010, Greenhalgh et al. 2011, Storey 2011).

Although implementation science is evolving, there is an absence of policy implementation studies that take a theory-based ‘realist’ rather than ‘idealist’ approach to policy development, implementation and evaluation. We could not find any examples of theory-based realist approaches being adapted and applied in the messy and unpredictable world of ‘real-time’ policy development, implementation and evaluation. In this context, ‘real-time’ is defined as a concurrent and active developmental process to optimize implementation and outcomes as intended whilst the evaluation is ongoing.

Examples of realist syntheses include reviews to explain the role of Government policy in supporting nurse-led care in general practice (Hoare et al. 2011) and application of health inequalities evaluation frameworks (Davies & Sherriff 2011). Realist evaluation has been used to study implementation of complex interventions in various contexts, including protocol-based care (Rycroft-Malone et al. 2010), understanding the impact of a nutritional intervention during smoking cessation (Mackenzie et al. 2009) and recruitment and retention of overseas nurses in the United Kingdom (UK) National Health System (NHS) (O'Brien & Ackroyd 2012).

The UK Government cross-departmental policy, used as a methodological exemplar here, concerned children requiring access to continuing-care funding to meet their ongoing complex healthcare needs. Children with complex and continuing healthcare needs are a rapidly growing, but still relatively small population where there is generally low-level clinical expertise (Perrin 2002). For additional background context to the development and timeline of the Children's Continuing-care Framework and supporting tools, see Data S1. The implemented policy and tools can be downloaded from the link at Department of Health (DH) (2010a,b).

The study


Aim

To compare existing and new processes and outcomes by applying theory-based realist evaluation methods in a novel policy implementation context.

Objective of methodological exemplar

To show how adapted theory-based realist principles were used with diffusion of innovation theory in a new rapid policy implementation context.

Methodology

The evaluation focused primarily on the role and function of a decision-support tool developed to clarify whether children with complex healthcare needs met the threshold for accessing continuing-care funding and, secondly, on the function and utility of a continuing-care pathway.

Theory-based evaluation and theoretical frameworks

In designing the evaluation, we used diffusion of innovation theory with Yin's multiple case study design (Yin 2009) and theory-based evaluation using adapted realist principles.

We used Rogers' four stages of ‘diffusion of innovation’ (invention, diffusion via communication through the social system, time and consequences) as an a priori conceptual framework (Figures 1 & 2). We postulated that the flow of information through networks in participating sites was critical and that these networks would be unique in each site (Rogers 1962). Innovations that are perceived as relatively advantageous (over the ideas or practices they supersede), compatible with existing values, beliefs and experiences, relatively easy to comprehend and adapt, observable or tangible and separable for trial are, according to Rogers (2003), adopted more rapidly. A unique aspect of this study is that we considered that implementation issues impacting on these five intrinsic factors could be manipulated to some degree in real time to optimize adoption. In this context, Rogers (1962) sees diffusion and adoption as a five-step decision-making process. Diffusion occurs through a series of communication channels over a period of time among members of a similar social system. Figures 1 and 2 show Rogers' five steps (knowledge, persuasion, decision, implementation and confirmation) mapped against our facilitation strategies to optimize implementation and adoption and against Rogers' five intrinsic factors for acceptance or rejection (relative advantage, compatibility, complexity, trialability and observability); see also the further description of facilitation strategies below.

Figure 1. Evaluation design.

Figure 2. Rogers' five-stage model of diffusion and adoption.

Theory-based realist principles

Theory-based approaches, such as ‘realist’, aim to determine what works for whom, in what circumstances and why mechanisms work (or not) (Pawson et al. 2005). Realist methods involve making explicit the programme theory (or theories) about how an intervention (in this case a policy) is meant to work and its anticipated impacts when implemented. Researchers then seek out evidence to populate an a priori theoretical framework and modify the programme theories as supporting or conflicting evidence is interpreted and understood.

Using a ‘diffusion of innovations’ lens, we drew on methods of theory-based evaluation to help make sense of case study evaluation data and to ‘unpick’ and better understand the programme logic (sequence of inputs, activities, outputs and outcomes) in each clinical site and the programme theory (explanatory account of how the programme works). We adopted the similar positions of Pawson and Tilley (2007) and Weiss (1997) on mechanisms, as summarized by Astbury and Leeuw (2010): ‘Programmes work (have successful ‘outcomes’) only insofar as they introduce appropriate ideas and opportunities (‘mechanisms’) to groups in the appropriate social and cultural conditions (‘contexts’). The mechanism of change is not the policy intervention – it is the response that activities generate'.

Policy development, implementation, optimization and evaluation

Using the diffusion of innovations framework (Figures 1 & 2), processes and chronological sequence of activities carried out in each of the 12 sites are described under the following four phases and shown diagrammatically in Figure 3.

Figure 3. Flow diagram through comparative and embedded case study in each site.

Invention phase: development of the national children's continuing-care framework and supporting tools (the policy)

A cross-Government expert group, including the evaluation team, nurses, commissioners and leading children's charities, was convened. Over a period of months, with periodic face-to-face meetings, the framework and philosophy were drafted and refined, a continuing-care pathway was adapted from an existing format (ACT 2004) and a decision-support tool was developed.

The underpinning policy philosophy (programme theory) was also designed to align with core principles and key priorities of the Government-led disabled children's review (Department for Children, Schools & Families, Department of Health Aiming High for Disabled Children 2007, DH 2007). The review focused on three priority areas to improve the lives of disabled children and their families: access and empowerment, responsive services and timely support and improving quality and capacity.

The policy aim was to set out Primary Care Organisation and Strategic Health Authority responsibilities and to assist them in applying a consistent and transparent approach to assessing the healthcare needs of children and young people and to work jointly with local authorities to provide services in light of those needs (DH 2010a,b).

Decision-support tool

The decision-support tool was adapted for children from the adult version (DH 2007) and was designed to assess children's complex healthcare needs across 10 care domains (programme theory and logic), as follows:

  • challenging behaviour
  • communication
  • mobility
  • nutrition – food and drink
  • continence and elimination
  • skin and tissue viability
  • breathing
  • drug therapies and medicines
  • psychological and emotional
  • seizures.

Each care domain had up to five levels of need (priority, severe, high, medium and low), based on a mixture of complexity, intensity, unpredictability of need and risk to the child/young person. A rating of at least three ‘high’, one ‘severe’ or one ‘priority’ indicated that the child was likely to have continuing-care needs.
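
The threshold rule above maps directly onto a short decision function. The following Python sketch is purely illustrative: the dictionary-based data structure, function name and example ratings are assumptions made for this sketch, not part of the Department of Health tool, which is an assessment form completed by practitioners.

    from collections import Counter

    # Levels of need used in the decision-support tool, as described above.
    LEVELS = {"priority", "severe", "high", "medium", "low"}

    def likely_continuing_care(domain_ratings):
        """Apply the threshold rule described in the text: one 'priority',
        one 'severe' or at least three 'high' ratings across the ten care
        domains suggest the child is likely to have continuing-care needs.
        `domain_ratings` maps each care domain to its assessed level of need
        (an illustrative structure, not the actual paper-based tool)."""
        counts = Counter(level.lower() for level in domain_ratings.values())
        unknown = set(counts) - LEVELS
        if unknown:
            raise ValueError("Unrecognised level(s): %s" % unknown)
        return (
            counts["priority"] >= 1
            or counts["severe"] >= 1
            or counts["high"] >= 3
        )

    # Hypothetical example: one 'severe' rating is sufficient.
    ratings = {
        "breathing": "severe",
        "nutrition - food and drink": "high",
        "mobility": "medium",
        "challenging behaviour": "low",
    }
    print(likely_continuing_care(ratings))  # True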

Continuing-care pathway

The continuing-care pathway aimed (programme theory) to link children, young people and their families with community, hospital-based and local authority services and not-for-profit services to ensure a joined-up and integrated approach to meeting their continuing-care needs.

Phase two: diffusion via communication through the social system

Site recruitment

The Department of Health invited Primary Care Organisations to participate in policy implementation. Potential sites completed a structured demographic profile, which requested information about organization size, reach, services offered and other characteristics, such as local population descriptors (urban vs. rural). Thirteen sites responded and twelve took part, representing twelve of thirteen Strategic Health Authorities in England; one site decided to participate in another evaluation (Table 1).

Table 1. Characteristics of the 12 pilot sites (B–M).

Implementation strategies

Legal status of the new framework and supporting tools

As the new policy had no legal status, practitioners were required to follow existing decision-making processes and record the outcome. The decision derived from existing processes was conveyed to the child and family and their associated multi-agency healthcare team. The child and family then followed the existing pathway through care. The hypothetical outcome from the new policy was NOT conveyed to the child or family.

Facilitation

Diffusion of innovation theory places the greatest emphasis on diffusion through communication in the network or system. We conceptualized each site as a system and implementation leads (DW and KB) would play key roles as opinion leaders and communicators to enhance flow of information into and through each network in the system. We conceived communication as a two-way process with information giving and sharing, which could influence the likelihood that the framework and supporting tools would be adopted. DW and KB were uniquely placed as opinion leaders as they had been key members of the team to develop the supporting tools and both were working at Consultant Nurse level and practising in this field.

A novel aspect of the implementation strategy was the importance placed on active facilitation as a way of communicating with and supporting local nurses to think about local context and what needed to change in terms of culture, service design and processes and professional behaviours to optimize implementation of the supporting tools (diffusion of innovations). Local facilitators (usually key clinical nurses in positions of responsibility) were identified in each site and their employer was reimbursed for 1 day per week for 4 months.

Implementation leads DW and KB developed a practice-based teaching and implementation strategy and worked with local facilitators in each pilot site to implement the new framework, decision-support tool and care pathway (1·5 days per week, plus 1 day per week for administration for 7 months). In practice, this entailed developing a new set of ‘hypothetical’ nursing documentation and cognitive and practical processes, which were individually tailored for each site and incorporated the decision-support tool and care pathway.

Establishing a mini-community of clinical practice

We were also keen to elicit the tacit and experiential knowledge of the practitioners who were involved in the evaluation and tasked with local implementation in different types of services and settings across the 12 sites. Communities of clinical practice usually come together over longer periods to develop and share knowledge; nevertheless, a critical part of the implementation strategy was to create a similar ethos of knowledge exchange and partnership so that practitioners' experiential knowledge of implementing the framework and supporting tools could be drawn from their practice (Gabbay & Le May 2011). To create what we termed a ‘mini-community of clinical practice’, we organized a practitioner-led midway meeting to bring practitioners together for this purpose. We hoped that the community of practice would help modify opinion.

Phase three: time

The evaluation was scheduled to run for 3 staggered calendar months in each of the 12 sites with a 1 month lag for decisions to be finalized. Thereafter, if no decision was reached, the case was recorded as unresolved.

Data collection

The following data were collected during 2007–2008.

Baseline

Demographic profiles of sites

Structured demographic profiles submitted by sites as part of the application process were used as baseline contextual data.

Existing processes and practice

DW and KB conducted tape-recorded, in-depth, semi-structured individual and/or group interviews with key practitioners in each site to articulate existing site processes and pathways.

Comparing existing and new processes

Local practitioners collected anonymized information on a detailed questionnaire, which recorded processes and outcomes from using existing and new processes on all children referred for assessment or re-assessment during the 3 month pilot in each site. Thereafter, any children without decisions were considered unresolved.

No patient identifiable information was recorded. Questionnaires recorded child and family demographics in ranges, types of assessments undertaken and outcomes using existing and new processes. The final section documented time frames from referral to decision and complaints.

Convening a mid-pilot ‘mini-community of clinical practice’

Having gained some experience of implementing and using the new framework and supporting tools, we asked multi-agency practitioners from each site to share ideas about what was working well and what in their view needed to happen to facilitate the required changes in culture, behaviours and service models to better accommodate the new policy and decision-support tools (i.e. implementation). Practitioners were also asked to participate in small group work to share their views on the fitness for purpose and utility of the care pathway in their own settings to help inform further development. The continuing-care pathway was amended substantially to accommodate feedback.

End of evaluation interviews in each site

DW, KB and GB conducted tape-recorded in-depth semi-structured individual and/or group interviews with key multi-agency staff and decision makers at the end of active engagement with each site to ascertain their views and experiences.

Interviews with implementation leads at completion of the evaluation

DW and KB were interviewed at the end of 2008 to gain their overall perspectives and views on the evaluation approach generally and future development of the framework and supporting tools specifically.

Phase four: consequences

Each site was conceived as a ‘case’ and we used Yin's comparative case study design as a way of organizing and comparing data on existing and new children's continuing-care processes across sites. We also added an embedded ‘case’ as an additional unit of analysis in each site to look at outcomes for individual children being processed simultaneously, using existing and new processes.

To understand and make sense of the ‘consequences’, we managed data in the following ways:

  • Each site was given an alphabetical code B-M (Table 1).
  • Baseline assessment and funding procedures and processes in each site were collated (Table 1).
  • Questionnaire data for each child were anonymized, entered into SPSS and reported using descriptive statistics (a hypothetical illustration of this kind of summary follows this list).
  • Interviews with practitioners were transcribed, anonymized and coded for key information. The predominantly deductive ‘Framework approach’ was used to categorize and synthesize themes (Ritchie & Spencer 1995).
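
For illustration only, the sketch below shows the kind of descriptive summary produced at this step. The evaluation itself used SPSS; the Python/pandas code, column names and example rows here are hypothetical and carry no real data.

    import pandas as pd

    # Hypothetical, anonymized questionnaire extract: one row per referred child.
    # Column names and values are illustrative; the evaluation used SPSS.
    children = pd.DataFrame({
        "site": ["B", "B", "C", "D", "D", "E"],
        "referral_type": ["new", "review", "new", "new", "review", "review"],
        "existing_outcome": ["funded", "not funded", "funded", "unresolved", "funded", "not funded"],
        "new_tool_outcome": ["funded", "not funded", "funded", "funded", "funded", "fast track"],
    })

    # Descriptive statistics of the kind reported in the findings:
    print(children["referral_type"].value_counts(normalize=True))
    print(pd.crosstab(children["existing_outcome"], children["new_tool_outcome"]))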

Theory-based ‘realist’ analysis

With a diffusion of innovation lens, we used principles of theory-based ‘realistic’ evaluation (Pawson & Tilley 2007) primarily to create a ‘structure’ in the form of context, mechanism and outcome chains, enabling us to undertake the stage of case study analysis that Yin describes as ‘pattern matching’ (an illustrative sketch of such a structure follows the list below). The aim was to create and refine an understanding of potential mechanisms (M) and outcomes (O) from three comparative processes in the context (C) of the entire dataset, each site and then each individual child, as follows:

  • Existing and new care pathway processes
  • Existing and new assessment processes
  • Existing and new decision-making processes
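
Purely as an illustration of the ‘structure’ referred to above, the sketch below represents context-mechanism-outcome (CMO) configurations as simple records that can be grouped and compared across sites and processes for pattern matching. The field names and example entries are assumptions for this sketch, not the evaluation team's actual coding scheme.

    from dataclasses import dataclass

    @dataclass
    class CMO:
        """One context-mechanism-outcome configuration from the case study data."""
        site: str       # site code B-M
        process: str    # 'care pathway', 'assessment' or 'decision-making'
        variant: str    # 'existing' or 'new'
        context: str    # e.g. 'no designated children's community nursing team'
        mechanism: str  # the response the process generated
        outcome: str    # e.g. 'funding decision reached within time frame'

    chains = [
        CMO("C", "decision-making", "new", "established funding panel",
            "clear rationale for meeting funding criteria", "decision reached"),
        CMO("K", "assessment", "new", "no children's community nursing team",
            "tool could not compensate for missing infrastructure", "implementation incomplete"),
    ]

    # 'Pattern matching' in Yin's sense: group comparable configurations and
    # look for recurring mechanism-outcome pairings across contexts.
    for variant in ("existing", "new"):
        matches = [c for c in chains if c.variant == variant]
        print(variant, [(c.mechanism, c.outcome) for c in matches])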

This co-productive phase, like qualitative data analysis, is difficult to explain. At its heart, we juxtaposed evidence on programme theory and logic with emerging theory and propositions, and looked in detail at different sources of case study evidence, supplemented with the expert opinion and experiential knowledge of evaluation team members.

Ethical considerations

This study was classified as ‘practice development and service evaluation’ and did not involve patients, so was deemed by the National Research Ethics Service as not requiring review by an NHS ethics committee. The new policy Framework and decision-support tools were used only in an additional hypothetical capacity as a dual process alongside existing processes. Usual clinical care and decision-making were not affected in any way. Practitioners in participating sites were assured of anonymity in reports and publications, provided with information sheets, and their written consent was sought prior to participating in ‘formal’ group or individual interviews. Data collection instruments and evaluation processes were designed to comply with data protection legislation. A group of multi-agency key professionals and stakeholders led by the Department of Health steered the evaluation.

Findings

We first report our experiences of adapting realist methodology to this novel context and then report evaluation findings.

Adapting theory-based evaluation using realist principles

We had to consider several modifications and adaptations to the general principles of theory-based evaluation to fit the accelerated policy implementation and evaluation time frame. For example, there were only a couple of weeks from the formal invitation to lead the evaluation in which to assemble an appropriate team, design the evaluation and be ready to collect data from sites that were already selected and ready to go. We had no opportunity to undertake a synthesis of literature to establish a set of propositions, or to develop an a priori theory of ‘what works’ in this specific context of care mapped against diffusion of innovation theory. The programme theory and logic were ‘unpicked’ from policy documents that were not written with a theory-based evaluation in mind and contained fairly abstract statements of intent rather than a coherent theory or explicit logic (with the exception of the decision-support tool and care pathway, where the logic was explicit). We initially relied on our expert testimony and clinical experience to develop an understanding of ‘what works’ and to design and individually tailor facilitation strategies for each site. JN and ML had previously developed best-practice guidance on discharge management and community support for children requiring long-term ventilation, which contained a best-practice example of what worked in one service in obtaining continuing-care funding (Noyes & Lewis 2005).

Outcome of the policy implementation, optimization and evaluation

A total of 137 children and young people across 12 sites were referred for assessment or re-assessment. We obtained comparable data on all children. Just over half were new referrals and just under half were reviews of existing cases.

Context

The different profiles of the sites are described in Table 1. A review of existing pathways to access continuing-care provision revealed a mixed picture of practice, complicated by recent integration of Primary Care Organisations and varied progress towards Children's Trust arrangements (a reorganization under way at the time). High-level expertise was concentrated in a few centres that generally worked in isolation from the other sites, which were ‘working it out’ for themselves. For an additional, more detailed account of the variation in context across sites, see Data S2. The level of multi-agency involvement varied from high to low across sites. The clarity and transparency of routes to continuing-care funding and funding for complex-care packages could also be ranked from high to low. For example, some sites had clear procedures and processes in place, with a panel that made funding decisions within a specified time frame, whereas others had no clear pathway to obtain funding and a range of different people to whom requests for a decision could be submitted. Where pathways existed, they were often linked to available provision, resources or specific conditions. Technology-dependent children, rather than those with behavioural or mental health needs, were commonly supported by pathways to access continuing-care funding.

Some sites had highly developed children's community nursing teams with identified lead practitioners to undertake assessments. However, at the other end of the spectrum, there were sites with no designated children's community nursing team and assessments were undertaken by practitioners as an additional role in adult services with few resources, limited experience of the needs of children and insufficient time.

At baseline, sites were using a wide variety of assessment tools and documentation at different stages of development. Most were based on the adult continuing-care model and were designed for use by a single agency – health. Most processes predominantly focused on physical health assessment.

Mechanisms

Implementation ranged from successful (9 sites) to resistant (1 site) and incomplete (2 sites). We identified ten mechanisms that explained implementation success or failure from two perspectives (professional and service user). The mechanism was the response that the existing processes or the framework and supporting tools generated.

Mechanisms explaining implementation success and failure mapped against Rogers' intrinsic factors

Mechanism 1. Relative Advantage – Nurses favoured new processes over existing

We deemed implementation to be successful in nine sites because nursing teams had developed, integrated and used the new processes and were more positive about them than about the old processes. Existing joined-up working between individuals in and across agencies appeared to facilitate the ease with which the decision-support tool and care pathway were used. Although the framework and supporting tools had no legal status, by the end of the evaluation we observed that sites preferring the new processes had already modified their existing procedures to incorporate into routine practice elements they could see were working well.

Mechanism 2. Compatibility/trialability – Implementation leads identified barriers to successful implementation and instigated additional interventions to resolve them

For example, implementation leads observed that existing written assessments of children's needs that were key to decision-making varied considerably in quality and depth. Implementation leads provided additional targeted support on gathering good quality information from all agencies to make assessment meaningful and useful when used with the decision-support tool (new process).

Mechanism 3. Compatibility/observability – Funders liked having a clear rationale for funding

In 69 children (50%), the decision-support tool was deemed specifically helpful for funding panel members and commissioners of services (i.e. the decision-makers), because it clarified the rationale for a child meeting funding criteria.

Mechanism 4. Compatibility/Trialability – Practitioners recognized that the programme logic was not a good fit locally and knew how to fix it

The mini-community of practitioners reported that in ‘reality’ the new care pathway did not ‘fit’ entirely and the logic needed significant further adaptation for optimal implementation. Adaptations requested are summarized in an additional file (see Data S3). Further refinement was undertaken in ‘real-time’ consultation and an adapted pathway was re-implemented in sites during the pilot.

Mechanism 5. Trialability/Observability – Practitioners were generally positive about working with implementation leads and local facilitators

Practitioners liked providing feedback on how to improve the Framework and supporting tools and were positive about the facilitated input they received to develop a set of local hypothetical documentation and procedures for comparison with existing procedures. Optimization was mostly undertaken in ‘real-time’ with re-implementation to see if modifications worked better.

Mechanisms explaining incomplete or failed implementation

Mechanism 6. Relative Advantage – Preference for existing processes

The one ‘resistant’ site preferred their existing processes that set out an entitlement to care and services that could be provided from existing services.

Mechanism 7. Insufficient existing infrastructure (outwith Rogers' model)

Nurses in two ‘incomplete’ sites had insufficient capacity or insufficient elements of a ‘process’ or existing infrastructure in the service for them to make any real difference to outcomes by implementing the framework and supporting tools. This situation mirrors Gunn's (1978) assessment (Figure 4) as to why policies do not get implemented. The decision-support tool was described as a ‘good start’ where nothing had previously existed and stimulated some useful discussions. However, the added utility of the decision-support tool was not evident and success of the process appeared to be dependent on the ability of staff to mitigate major deficits in service delivery and organization. We identified points in the process where additional interventions were required (service reorganization and investment) so that implementation could proceed, but implementation remained incomplete at evaluation end.

Figure 4. Gunn's ten reasons why policy implementation fails.

Mechanism 8. Compatibility – Lack of shared expectations

A lack of clarity of processes or systems led to both parental and/or professional expectations being out of alignment with assessed needs and availability of resources locally. There were 11 incidences of complaints and appeals using existing processes, one of which underwent judicial review. Five complaints remained unresolved at the pilot end.

Mechanism 9. Complexity – Inconsistent application of criteria

Our observations helped explain the so-called ‘postcode lottery’ experienced by parents when attempting to access appropriate continuing-care packages for their children. Of the 137 cases referred for assessment, the decision-support tool indicated that 73 (53%) were considered for continuing-care funding, with an additional 12 (9%) channelled for fast tracking to continuing care. Forty-two (31%) cases, although initially referred for assessment for continuing-care, were channelled by the decision-support tool to specialist and universal services. In 8 (6%) cases, there was no indication how the decision-support tool channelled the child. In 91 cases (66%), the outcome of using the decision-support tool matched the existing process and in 32 cases (25%), it did not. Reasons why the decision did not match varied widely and largely stemmed from human factors and behaviours, which are now better understood and can either be accommodated or addressed.

Mechanism 10. Compatibility/Trialability – Decision-support tool required further adaptation before it could be applied consistently

In 94 children (69%), the decision-support tool was easy to apply, particularly in relation to physical health needs. The decision-support tool was less easy to use with 29 (21%) children. Some of the decision-support tool domains did not capture fully the developmental or specific needs of children, which limited its value. These were behaviour, psychological and emotional, continence, breathing, nutrition and skin and tissue viability.

In addition, some areas of need that could usefully be incorporated were identified as missing from the domains. These included equipment needs, training needs for parents and carers, parental sleep deprivation, parenting ability, school attendance and fluctuations in the child's condition. The decision-support tool was subsequently modified.

Outcomes

Following significant revision (optimization), the decision-support tool and care pathway had a more realistic ‘fit’ with what nurses considered would work (trialability/compatibility). The new supporting tools were deemed to add value to decision-making when combined with a robust assessment process – but professional judgements were still required. The framework and supporting tools did not resolve relationships between agencies, where lack of agreement and joint decision-making were historically an issue.

We observed least impact in the most developed and least developed sites as processes were either already working well and matched intentions of the framework and supporting tools, or not working at all well and introduction of new processes made no difference due to existing system deficiencies. The most impact was seen in sites that had developed some processes, but saw a clear benefit (high relative advantage) from implementing the framework and supporting tools – especially when further optimization had been undertaken during the pilot.

We reported reasons as to why assessments on 32 children using the decision-support tool did not match the outcome from existing processes (complexity). We had a clear picture about diagnoses of children presenting for assessment across 12 pilot sites and could map which children – especially those with behavioural problems – would potentially meet the ‘new’ criteria for continuing-care funding (and therefore increase costs) if the decision-support tool was modified according to feedback from sites. This information contributed to an additional policy impact analysis undertaken by the DH (2010b).

Discussion


Significance, validity, rigour and generalizability

This large-scale policy implementation is significant in the following ways. The evaluation illustrates the first application of real-time, adapted theory-based evaluation methods, proactive facilitated implementation strategies and co-production of findings. This unique process enabled substantial changes (optimization) to be made during implementation through a two-way ‘diffusion of innovations’ process. The most significant factor was the use of Consultant Nurse-led facilitative implementation strategies, which enabled Rogers' intrinsic factors to be addressed in favour of adoption and allowed substantial elements of the programme logic and local context in sites to be manipulated (optimized) to ‘best fit’ our emerging and evolving theories of what worked for whom in which contexts (Ross-Adjie et al. 2012). A mini-community of practice and expert experiential opinion were used to change the decision-support tools and care pathway during implementation so that the tools more ‘realistically’ aligned with the experiences of those implementing the new policy locally. Implementation leads and local facilitators also worked as opinion leaders to modify elements of the context to lessen identified obstacles to implementation (compatibility), such as increasing the skill set of local practitioners and developing individually tailored local processes and documentation. As a broader marker of generalizability, recently published protocols are taking a similar approach to understanding the complexity of innovation adoption at different levels (Kyratsis et al. 2012).

The ability to address Rogers' specific intrinsic factors for adoption by optimization and re-implementation in ‘real-time’ was important. Although the new framework was principally drafted by the Department of Health and a wider advisory group (an example of an authority-innovation decision) – the supporting tools were developed by a clinically led expert group of practitioners who managed services for children with complex needs (authority/optional-innovation decision). When implemented in wider routine practice contexts across England, it became clear that further development of the supporting tools would strengthen general applicability of the overall policy to local contexts – therefore, collective-innovation decisions made by all individuals in the clinical network were vitally important.

We observed the importance of what Rogers defines as ‘collective-innovation decision-making’, or co-production of evaluation ‘findings’, which proved to be more important critical success factors than decisions made by those outside the social system in a position of power (authority-innovation decision-making). Implementation leads had the high-level communication, managerial and context-specific clinical skills to negotiate their way into sites, ask the right questions and work effectively with local practitioners. Active facilitation strategies delivered by experienced Consultant-level nurses, acting with local facilitators as key opinion leaders, worked well in nine sites to support adoption of the intervention by increasing practitioner knowledge, persuading practitioners of the benefits and enabling them to decide that the advantages of using the framework and supporting tools outweighed the disadvantages (Rogers' key stages of adoption) (Harvey et al. 2002). Implementation leads helped practitioners see the positive intrinsic characteristics, such as relative advantage, and increased compatibility through local development and tailoring, thereby reducing the complexity of the framework and supporting tools (Rogers' key intrinsic factors when accepting or rejecting an innovation).

For the first time, using theory-based evaluation techniques, we were able to map policy principles and statements of intent (programme theory) from two key policies (the Children's Continuing-care Framework and Aiming High for Disabled Children) and demonstrate what successful implementation ‘looked like’ in practice. For a detailed description of evaluation findings mapped against key policy statements to show the component parts of an effective Children's Continuing-care Framework (i.e. our theory of what works for this small population of children with highly complex needs), see Data S4. One site came closest, and two others were close, to matching the ideal (optimal post-hoc programme theory of what worked) and stood out as services where things were working well. As another indicator of generalizability, other sites now have the potential to benchmark against this ‘best-practice’ standard and evolve their practice to align more closely with what was seen to work for this group of highly complex children.

Limitations and strengths

In our view, it would have been helpful to gain better understanding of service users' perspectives and experiences during the evaluation rather than in the subsequent Department of Health-led public consultation (DH 2010a). For example, we found that only one service had parent-focused information on what to expect and no child-centred information was identified.

The evaluation was more expensive than a ‘top-down’ policy implementation. The main cost was buying out nurse time to work as lead implementers or local facilitators in partnership with a Higher Education Institution. By the evaluation end, most local facilitators had developed an appropriate skill set that could be used for further dissemination and roll-out across new localities using similar diffusion of innovation principles and an expanded community of practice. This approach to the use and implementation of evidence can therefore be viewed as incorporating an added-value professional development component that ‘up-skills’ local practitioners.

Following Ministerial sign-off, the new policy and supporting tools acquired legal status and superseded existing NHS (England) processes. We next envisaged pairing local facilitators from sites where implementation had been successful with new sites that had sufficient baseline infrastructure, to share experiences of best practice and what worked. Sites with insufficient existing infrastructure required a different implementation strategy, including targeting the Chief Executive as a key change agent in the organization to put in place an infrastructure in which the policy could be implemented. Future studies are needed to establish whether organizational and behaviour changes have sustained implementation (Martin et al. 2011).

Conclusion


The exemplar illustrates a significant advance in policy implementation and evaluation methodology and demonstrates the advantages of using a ‘realistic’ approach to optimize implementation. Following optimization of the decision-support tool to include a greater emphasis on mental health and behavioural issues, the evaluation showed that the new policy was likely to cost more as more children met widened inclusion criteria.

Application of theory-based ‘realist’ methodology worked well, produced rich findings, up-skilled local staff and enabled significant policy optimization before Ministerial sign-off and universal roll out. Practitioners liked the freedom of evolving and optimizing a new policy that had no legal status, whilst continuing to use existing processes. We consider the approach to be generalizable to other complex policy implementations, especially where there is low-level or dispersed clinical expertise that is unlikely to diffuse or disseminate practice innovations ‘naturally’ without facilitated implementation interventions.

Acknowledgements


The evaluation team would like to express its gratitude to the project team and steering group led by the Children and Families Directorate at the Department of Health for their input, advice and guidance. The project team and steering group members brought a range of experience, knowledge and expertise. In particular, we are grateful to Amy Nicholas, Pat Nicholls, Jane Appleby, Gail Tremi, Katrina McNamara-Goodger, Lizzie Chambers, Alcuin Edwards, Paul Hughes, Carol Fickling, Kathryn Halford, Christine Lenehan, Angela Thompson, Nicola Watt, Heather Sahman, Sue Sylvester, Kathy McTaggart, Mark Whiting, Liz Morgan, Angie Glew, Fiona Smith and Sarah Louise Smith.

This evaluation relied almost entirely upon the enthusiasm and support of members of multi-agency teams who collected data, participated in interviews and provided documentary evidence, across twelve pilot sites in England. We are grateful for their significant contribution to the success of the evaluation. We thank Nyree Hulme for administrative support.

Funding


This study was funded by the Department of Health, England. The views and opinions expressed herein are those of the authors and do not necessarily reflect those of the Department of Health.

Conflict of interest


No conflict of interest was declared by the authors in relation to the study itself. Note that Jane Noyes is a JAN editor, but, in line with usual practice, this paper was subjected to double-blind peer review and was edited by another editor.

Author contributions


All authors have agreed on the final version and meet at least one of the following criteria [recommended by the ICMJE (http://www.icmje.org/ethical_1author.html)]:

  • substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data;
  • drafting the article or revising it critically for important intellectual content.

References

  • ACT (Now Together for Short Lives) (2004) Integrated Care Pathway for Children and Young People with Life-Threatening or Life-Limiting Conditions and their Families. ACT, Bristol.
  • Astbury B. & Leeuw F.L. (2010) Unpacking black boxes: mechanisms and theory building in evaluation. American Journal of Evaluation 31, 363–368.
  • Bell D. & Bowes A. (2006) Financial Care Models in Scotland and the UK. Joseph Rowntree Foundation, York, UK. Retrieved from http://www.jrf.org.uk/publications/lessons-funding-long-term-care-scotland on 1 July 2012.
  • Davies J.K. & Sherriff N. (2011) The gradient in health inequalities among families and children: a review of evaluation frameworks. Health Policy 101(1), 1–10. doi: 10.1016/j.healthpol.2010.09.015.
  • Department for Children, Schools and Families & Department of Health (2007) Aiming High for Disabled Children. Retrieved from http://www.education.gov.uk/childrenandyoungpeople/sen/ahdc/b0070490/aiming-high-for-disabled-children-ahdc on 1 July 2012.
  • Department of Health (2007) The National Framework for NHS Continuing Healthcare and NHS-funded Nursing Care, Revised 2009. Department of Health, London.
  • Department of Health (2010a) National Framework for Children and Young People's Continuing-care. Department of Health, London. Retrieved from http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_114784 on 1 July 2012.
  • Department of Health (2010b) Impact Assessment of the National Framework for Children and Young People's Continuing-care. Department of Health, London. Retrieved from http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_114784 on 1 July 2012.
  • Eccles M.P., Armstrong D., Baker R., Cleary K., Davies H., Davies S., Glasziou P., Ilott I., Kinmonth A.L., Leng G., Logan S., Marteau T., Michie S., Rogers H., Rycroft-Malone J. & Sibbald B. (2009) An implementation research agenda. Implementation Science 4, 18.
  • Gabbay J. & Le May A. (2011) Practice-Based Evidence for Healthcare: Clinical Mindlines. Taylor and Francis, Routledge, Abingdon, UK, 143 pp.
  • Greenhalgh T., Russell J., Ashcroft R.E. & Parsons W. (2011) Why national eHealth programs need dead philosophers: Wittgensteinian reflections on policymakers' reluctance to learn from history. Milbank Quarterly 89(4), 533–563. doi: 10.1111/j.1468-0009.2011.00642.x
  • Gunn L.A. (1978) Why is implementation so difficult? Management Services in Government 3, 169–176.
  • Harvey G., Loftus-Hills A., Rycroft-Malone J., Titchen A., Kitson A., McCormack B. & Seers K. (2002) Getting evidence into practice: the role and function of facilitation. Journal of Advanced Nursing 37(6), 577–588.
  • Hoare K.J., Mills J. & Francis K. (2011) The role of Government policy in supporting nurse-led care in general practice in the United Kingdom, New Zealand and Australia: an adapted realist review. Journal of Advanced Nursing 68(5), 963–980.
  • Kyratsis Y., Ahmad R. & Holmes A.H. (2012) Making sense of evidence in management decisions: the role of research-based knowledge on innovation adoption and implementation in healthcare. Study protocol. Implementation Science 7(1), 22.
  • Mackenzie M., Koshy P., Leslie W., Lean M. & Hankey C. (2009) Getting beyond outcomes: a realist approach to help understand the impact of a nutritional intervention during smoking cessation. European Journal of Clinical Nutrition 63(9), 1136–1142.
  • Martin G.P., Currie G., Finn R. & McDonald R. (2011) The medium-term sustainability of organisational innovations in the national health service. Implementation Science 6, 19.
  • McLaughlin M.W. (1987) Learning from experience: lessons from policy implementation. Educational Evaluation and Policy Analysis 9(2), 171–178.
  • Noyes J. & Lewis M. (2005) Hospital to Home: Discharge Management for Children on Long-term Ventilation. Barnardos, Barkingside.
  • O'Brien T. & Ackroyd S. (2012) Understanding the recruitment and retention of overseas nurses: realist case study research in National Health Service Hospitals in the UK. Nursing Inquiry 19, 39–50.
  • Pawson R. & Tilley N. (2007) Realistic Evaluation. Sage Publications, London, UK.
  • Pawson R., Greenhalgh T., Harvey G. & Walshe K.J. (2005) Realist review – a new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy 10(Suppl 1), 21–34.
  • Perrin J. (2002) Health services research for children with disabilities. The Milbank Quarterly 80(2), 303–324.
  • Ritchie J. & Spencer L. (1995) Qualitative data analysis for applied policy research. In Analysing Qualitative Data (Bryman A. & Burgess R., eds), Routledge, London, pp. 173–194.
  • Rogers E.M. (1962) Diffusion of Innovations. Free Press, Glencoe.
  • Rogers E.M. (2003) Diffusion of Innovations, 5th edn. Free Press, New York, NY.
  • Ross-Adjie G., McAllister H. & Bradshaw S. (2012) Graduated compression stockings for the prevention of postoperative venous thromboembolism in obstetric patients: a best-practice implementation project. International Journal of Evidence Based Healthcare 10(1), 77–81.
  • Rycroft-Malone J., Fontenla M., Bick D. & Seers K. (2010) A realistic evaluation: the case of protocol-based care. Implementation Science 5, 38.
  • Schultz T.J. & Kitson A.L. (2010) Measuring the context of care in an Australian acute care hospital: a nurse survey. Implementation Science 5, 60.
  • Storey J. (2011) Steering whilst rowing: governing and managing health services from the centre. Journal of Health Organization and Management 25(6), 625–644.
  • Weiss C.H. (1997) Theory-Based Evaluation: Past, Present and Future. New Directions for Evaluation. No. 76. Jossey-Bass, San Francisco, CA.
  • Yin R.K. (2009) Case Study Research: Design and Methods, 4th edn. Sage Publications, Thousand Oaks, CA.

Supporting Information

Data S1. Context to case study of Children's continuing-care policy implementation.
Data S2. Detailed account of the variation of context across sites.
Data S3. Adaptations requested by practitioners to the continuing-care pathway.
Data S4. Component parts of an effective Children's Continuing-care Framework (our theory of what works for this specific group of highly complex and small population of children) mapped against core principles of Aiming High for Disabled Children and Children's Continuing-care Framework.
