CHAPTER ONE: INTRODUCTION
1.1 This research was commissioned by the Scottish Executive Health Department and Scottish Executive Office of the Chief Researcher. Whilst much of the data comes from a health and social care perspective, it is likely to be of interest to policy makers and practitioners in health, social care and other areas of public sector service delivery with an interest in integrating evidence into practice.
Background to the study
1.2 The specific aims of this research were to explore how the Scottish Executive Health Department might begin to develop communities of practice by identifying individuals or 'champions' within the NHSS who may be in a position to promote service development and redesign through the use of action research and applied research. This would be the first stage of a longer term process involving a larger number of people within the NHSS, academia and other agencies and organisations.
Evidence based policy making and practice in the NHSS
1.3 The National Health Service Scotland is facing an unprecedented era of reform and investment, and politicians and the communities they serve have greater expectations that this will result in improvements in the management and delivery of services. Since devolution major health policies have sought to address the barriers to improving Scotland's health and priorities for action have been clearly set out in a number of service strategies in areas such as cancer and diabetes, underpinned by a sound clinical evidence base. Less evidence is available to support clinicians and managers in the delivery of services through a series of different management and organisational structures which require new ways of working across professional and geographic boundaries. In addition, all service improvement in NHSS is underpinned by a commitment to joint working with local communities and partner organisations, particularly Local Authorities and the Voluntary Sector, and there is a need for a sound understanding of how this can best be achieved. 1
1.4 The concept of evidence based policy making is now a standard of best practice in Government. Increased public scrutiny and accountability has created a heightened awareness of the need for politicians and decision makers to have a range of evidence to support policy development and decision making in the public sector. Improving access to evidence and engaging with stakeholders is at the heart of this programme. 2
Aims and objectives of this research
1.5 The specific aims and objectives were to:
a) Identify key individuals in NHSS systems with an interest in action research and applied research
b) Design and co-ordinate an NHSS event to bring together interested parties
c) Assess the level of preparedness in the academic community in Scotland for joint working with NHSS on action research and applied research
d) Identify specific opportunities for possible areas of inquiry
e) Provide accessible action research summary briefing notes or short 'how to' guides with signposts to further information and sources of expertise
1.6 A series of meetings and interviews was conducted with key individuals within the NHSS and academia with an interest in action research and applied research. These individuals were identified by SEHD and through subsequent suggestions made by initial contacts, and are listed in Annex 1.
1.7 Focus groups were also conducted with a total of 23 clinical and managerial staff from two Health Boards. These provided examples of the everyday opportunities and difficulties of using evidence in clinical and managerial practice in NHSS settings.
1.8 A total of eight case studies were compiled to illustrate different action research and applied research approaches. These draw on both published material and unpublished accounts gathered by issuing a call for examples through action research networks and suggestions made by interviewees. These case studies were used in the focus groups with clinicians and managers as appropriate to illustrate the range of approaches that could be adopted in service delivery settings. They also serve as broader useful illustrations or guides to action research and applied research for a wider audience and are reproduced in full in the Annexes.
1.9 This scoping study is intended to be practical and illustrative and is not based on a full literature review; however, where possible, existing literature has been drawn on and some references are provided.
Structure of this report
1.10 Chapter two explores some understandings of action research. Chapter three discusses using evidence in practice; it draws on some literature and data from the focus group discussions and stakeholder interviews to articulate understandings of the nature and use of evidence and the failure of much evidence of good practice to become good practice. Chapter four examines the potential for the development of partnerships between practitioners, academics and other external consultants and examines the issues around the facilitation of the use of evidence, collaboration and the value of service users' perspectives. Chapter five identifies opportunities and barriers to using action research and applied research and discusses the development of communities of inquiry within communities of practice. Chapter six concludes by examining how evidence based practice could be supported through the wider use of action research and applied research, identifies key issues for future collaboration and the wider lessons for the improvement of public sector delivery.
CHAPTER TWO: KEY ISSUES IN ACTION RESEARCH FOR THE NHSS AND THE WIDER PUBLIC SECTOR
What is Action Research?
2.1 Action research is not itself a research methodology, but rather 'an orientation to inquiry'. There is no short, accepted definition of action research 3,4.
2.2 The term 'action research' covers a wide range of approaches including co-operative inquiry, participatory action research, action inquiry and appreciative inquiry. It is a practice for the systematic development of knowing, rooted in experience, with the clear purpose of generating new forms of understanding, practical knowledge and skills amongst individuals, groups and communities of inquiry. A distinguishing feature of action research is that it is a form of self-reflective inquiry undertaken by participants to improve their own practices, their understanding of those practices and their understanding of the wider context in which they are conducted.
2.3 Action research draws on many forms of evidence or 'ways of knowing'. It is possible to distinguish between four different ways of knowing, as outlined in Figure 2.1 5.
Figure 2.1: Ways of knowing
- Experiential - knowing through direct experience, empathy and resonance. In-depth knowing; often tacit; almost impossible to put into words.
- Presentational - the expression of experience in story, artistic and visual means.
- Propositional - 'know what' - knowing about something. This draws on concepts and ideas and includes the formal explicit, objective, 'scientific' evidence.
- Practical - 'know how' - having a skill, knack or competence.
2.4 Traditional positivist research (focusing on 'observable phenomena') tends to concentrate on the development of 'propositional knowledge' (see Figure 2.1). In some fields, such as engineering or medicine, technical rationality is clearly appropriate, but in researching social interventions such research tends to focus on what people say rather than what they do 6. If such research generates only propositional knowledge, it will not easily translate into action or changes in behaviour, which rely on tacit knowledge, feelings, acceptance and other forms of knowing often dismissed as subjective and experiential.
2.5 The emergence of action research can be seen as a response to the failure of much research on good practice to translate into good practice, and to growing dissatisfaction with traditional approaches to research. These are too often concerned with completeness, precision and control, and with maintaining distance and detachment to safeguard 'objectivity'. As a result, they are disengaged from the people and contexts they seek to affect, and produce complex theories that are difficult to apply in real-time situations where there are many complex, constantly changing variables.
2.6 The emphasis in action research is on collaboration between all those involved in the inquiry, so that the knowledge developed in the inquiry process is directly relevant to the issues being studied. Action research is conducted by, with and for people, rather than research on people. This challenges traditional research perspectives and roles. In action research, the practitioners or members of the community of inquiry become the researchers of their own practice, in order to improve what they are doing, in the midst of action, rather than long afterwards. The conventional role of the researcher becomes more like that of a mentor or facilitator, assisting participants to drive their own high quality inquiries in a systematic way.
2.7 Good action research will strive to stimulate and connect inquiry at three levels 7. First-person research practices, self-study or reflective practice addresses the ability of individuals to foster an inquiring approach to their own lives, to act with awareness and consciousness of choices, and to assess effects in the outside world while acting. Skills of first-person inquiry are integral to leadership development. Second-person action research/practices address the ability to inquire face-to-face with others into issues of mutual concern, usually in small groups. Third-person research/practice includes a range of practices which draw together the views of large groups of people and create a wider community of inquiry involving persons who cannot be known to each other face-to-face.
2.8 Action research typically involves groups of participants, co-researchers and co-subjects engaging in cycles of action and critical reflection. This basic process has been elaborated in different ways in different schools of practice. Different types of action research are presented as case studies in the Annexes.
2.9 The case studies represent a spectrum of approaches covering a range of perspectives. They can be seen as part of a continuum: at one end, service redesign and improvement methodologies focus on building capacity to use evidence 8, in recognition of the deficiencies of the former emphasis on simply producing scientific or formal evidence and disseminating it to practitioners in the expectation that they will be able to access and use it appropriately. At the other end of the continuum, action research approaches focus on learning from practice, recognising and valuing diversity of perspectives and developing skills of reflective practice amongst practitioners.
2.10 There are a variety of action research approaches which focus on developing dialogue and reflective practice. This helps practitioners to notice the assumptions they are making and their intended and unintended effects. Action learning focuses more on making learning and skills development more deliberate by facilitating exploration of practitioners' own values and behaviours 9. This approach has also been used to help practitioners apply formal 'good practice' embodied in guidance and protocols to their own situation; there is evident scope to use this approach more widely to promote more active dissemination of formal evidence.
2.11 Action research also recognises the value of other forms of evidence, from service users in particular 10. Co-operative inquiry could be used with service users or practitioners to draw in their experience and wisdom 11. Action inquiry might be seen as a similar approach; it acknowledges that evidence from the literature has value when combined with practitioner wisdom and service user perspectives. Appreciative inquiry shows how a focus on what is working within a situation can generate new perspectives 12; like many of these approaches it is not mutually exclusive and could be used in conjunction with many of the others to articulate 'stories' (descriptive, first-person accounts of 'what happened') and to build connections and energy for change.
2.12 Systemic action research approaches focus on bringing in multiple perspectives from across service areas; making use of the wisdom and enthusiasm of those experiencing different parts of the system and exposing each of the parties to the real stories and experience of others 13. This illuminates interconnections, paradoxes and complexities and raises the prospect of system-wide transformational learning amongst individuals and the wider organisations and communities.
2.13 Each of these approaches has value in attempts to facilitate the use of evidence in practice and all have wider relevance for a number of other spheres of public service delivery which struggle to find the means to transfer evidence into practice.
CHAPTER THREE: USING EVIDENCE IN PRACTICE
3.1 All public services collect a mass of data, sometimes the things that are easiest to count and collect rather than the things that really matter to service users and staff. Even if the data is accessible and comprehensible, it seems a long way from knowing about the evidence on an issue to doing things differently: to changing behaviour. Across the public services, information about good practice is often failing to become good practice.
3.2 This chapter outlines some of the debate about knowledge transfer and getting evidence into practice. It draws on thinking about knowledge management which is concerned with better use of the knowledge contained within organisations and then discusses the challenges of using evidence in practice drawing on the focus group discussions within the NHSS.
3.3 As part of this scoping study we discussed a range of issues about using evidence in practice with participants. These included: what we know about using evidence in practice settings; the kind of evidence that is available and valued; the scope for using evidence from patients' and staff experience to affect service delivery; how the capacity to use evidence could be developed; and the need for space to learn and reflect on the real challenges of changing established individual and organisational practices and working cultures.
What do we know about moving evidence into practice?
3.4 In many areas of health and social policy, there is often a wealth of evidence about 'what works', some of it codified into guidance, procedures and protocols, but there remains the issue of the transfer of that knowledge into practice.
3.5 The gap between the rhetoric of evidence-based policy and practice and what happens on the ground in health and other social programmes and interventions has been acknowledged:
"At local level, where practitioners are under pressure to deliver tangible results, there are few opportunities to reflect on this gap, to have their own experience recognised or to contribute to the evidence base themselves. This can generate confusion, exasperation and cynicism"14.
3.6 Assumptions that 'good practice' is transferable to different contexts are common in many spheres of public service delivery, without regard for the facilitation of the transfer and application of that knowledge in context. Traditional research often generates recommendations, good practice guidance or protocols expected to guide practice. The problem of using evidence in practice tends to be viewed as a problem of dissemination: getting the evidence to the right people at the right time. In response to these concerns, many organisations are putting their efforts into being 'smarter' at dissemination. Implicit in this approach is the expectation that the simple identification and dissemination of evidence of effective interventions will lead directly to improvements in practice.
3.7 Considerable efforts have been made to disseminate good practice to support the implementation of evidence based policy and practice. Whilst this is welcome, it is increasingly being recognised that devising better mechanisms for dissemination is having only limited success 15. In reviewing the experience of the health service Collaboratives, commentators have referred to 'a considerable naivety' around the issue of knowledge transfer and 'knowledge into practice' within health care organisations 16. Others suggest that whilst attention still needs to be paid to the systematic review and collation of evidence and the dissemination of guidance, in order to redress the dominant focus on collating evidence, emphasis now needs to be placed on developing the capacity to deliver effective evidence-based practice and to learn from practice 17.
3.8 Some approaches to knowledge management and organisational learning emphasise the need to engage knowledge users in the process of sharing and searching for knowledge 18. These have been characterised as knowledge-pull approaches. They recognise the importance of tacit knowledge and the need to see knowledge not as a thing or object to be identified, catalogued and transferred, with knowledge developed in one context disseminated to and replicated in another. Rather, knowledge is seen as a social process of 'knowing', brought to light and shared through interaction. The local use of knowledge or evidence is thus linked to its generation and reinvention, rather than replication, by those expected to use it. The emphasis is on the importance of developing an organisational culture that values all kinds of knowledge and recognises the importance of sharing it.
3.9 In contrast, knowledge-push approaches seek to increase the flow of knowledge and information within organisations. These tend to focus on capturing, codifying and transmitting explicit knowledge, often through the use of IT. The dissemination of codes of good practice, guidelines and protocols which formalise evidence about how to do something may be seen as such an approach.
3.10 Some of the case studies collated here (see appendices) discuss initiatives designed to facilitate the dissemination and use of data, to build capacity and develop networks through which to disseminate good practice, including Collaboratives.
What kind of evidence is available?
3.11 Amongst those interviewed for this research and more widely, there is recognition of the inadequacies of applying scientific evidence to practice settings. A recent paper from the Health Development Agency acknowledged that one of the biggest barriers to getting evidence into practice is the nature of scientific evidence itself 19.
3.12 The HDA acknowledge that very few studies reach an 'ideal' gold standard for evidence based on randomised controlled trials (RCTs). The nature of scientific research designs means that critical variables, of vital interest to practice, are frequently excluded from consideration. This leads to a loss of both process information about the actions, motives and behaviours of the people involved in an intervention, and information about the mediating effect of variables such as the local context and circumstances. Process and local context data are vital when considering the transferability of evidence from one setting to another and when trying to implement evidence in practice. Evidence-based medicine has sought to find the best and strongest evidence through synthesising primary research based on RCTs. There is an implicit assumption that the stronger the evidence, the more powerful its influence on practice will be.
3.13 The pre-eminence of RCTs has been a strong influence on ideas about standards of evidence, although the RCT has proved inappropriate for evaluating complex interventions in health care and in wider public health and community based settings. For example, participants contributing to this study commented:
"There's a hierarchy of evidence; the RCT is still up there - even now. It's the quantitative versus qualitative debate. But it's about valuing both and not seeing one as better than another".
"It's very difficult for senior clinicians to accept. They want 100 consecutive cases. They're dyed in the wool researchers. It's P values or nothing! We do need to change that. That's right in certain circumstances - in certain parts of the organisation you do need to do that rigorous approach but not in all".
3.14 This view of evidence is deeply embedded in the health service culture which is dominated by the 'medical model'. Change would be a massive cultural task, but there are signs that some thinking is shifting, as noted by research participants:
"I think the culture's changing though. We've had lots of discussions about 'what does evidence mean?' and there's a much wider acceptance of evidence from a range of sources - even if that's just one patient's story. If it's just one person - we've got it wrong for that person, but it's being able to look at that in the light of all the other evidence, to see if there's actual change required or was that human error that caused that one patient's experience to be poor?"
3.15 The pre-eminence of quantitative data has implications for the capacity of staff to collect and interpret data and for the scope to use evidence as an improvement tool, which is discussed below. Clearly RCTs will continue to play a role, but there is also a need for different kinds of evidence through qualitative, applied and action research to support professional practice.
3.16 Research participants identified a lack of good evidence as one of their main problems. This suggests that there is still a need for systematic reviews, although in themselves they are not enough. Participants made a distinction between using the available evidence and being able to add to the evidence themselves. They identified a need for evidence of effectiveness. One example, provided by a research participant, showed how this had led to a decision not to reorganise services:
"A few years ago, mental health services were rushing out and developing dual diagnosis services all over the place. We did a study of the level of drug and alcohol abuse amongst our clients and a detailed literature review of 'what works' and on the basis of that decided not to develop a dual diagnosis service. All the evidence said that they don't really work better than standard services and in some ways have disadvantages. So although it was fashionable to split services up, we didn't and we probably saved ourselves some money!"
3.17 Another observation in the study focus groups was that lack of evidence of effectiveness isn't the same as evidence of ineffectiveness:
"One of our difficulties is reticence to publish negative results. We need recognition for demonstrating that there isn't evidence that something works - you should get credit for that, not just for proving you've done something that worked. Credit for the proof, rather than the pudding".
3.18 Research participants also questioned the basis of so called 'good' or 'best' practice:
"Sometimes best practice is simply 'asserted best practice' rather than established best practice. We have to leave room for common sense and professional views".
"What is best practice? 'Best' might be limited to your knowledge - the best example in Scotland or in a similar service to yours. As a patient, if that 'best' is on the national average that's not what I want. I want a perfect service! I want every single clinician to aim for perfect every time. Not national average or best in Aberdeen or Glasgow!"
3.19 Other participants wanted to be able to go beyond using the formal evidence to leave room for creativity and innovation:
"If we only do stuff that's evidence based we will eventually grind to a halt. Nobody would do anything new".
3.20 The HDA and others have acknowledged that scientific evidence provides a framework of plausible interventions and that the likelihood of success depends on local mediating factors. The HDA Evidence into Practice programme is about developing a means of collecting data about the mediating factors at a local level. The adaptation of 'promising practices' to enhance performance across varied organisational contexts is one of the themes of the ESRC Advanced Institute of Management (AIM) Research programme:
"A considerable inventory of apparently promising practices has been collected in the private and public sector of the UK. Some of that inventory is well founded upon evidence from rigorous research; other work is less well grounded. The potential value of this inventory is not realised because in part research is not reaching practising managers in ways that lead most effectively to adaptation and therefore changes in practice, but also in part because we do not understand enough about the specific conditions which either facilitate or disable the adoption of specific practices"20.
3.21 The transferability of evidence from one context to another is a key issue. Research participants highlighted some of the specific factors that influence the adoption of specific practices, including the nature of the local population, the scale of the problem and the perceived validity and applicability of evidence generated elsewhere. One commented:
"Some of this is about variability in the intervention - so many are people based and you can't standardise people or the way people practice things".
3.22 Participants raised the issue that evidence produced elsewhere is not always valued. Sometimes the evidence had to be produced locally to be convincing to practitioners. For example:
"I spent a year of my life proving something that I already knew! If you did a meta-analysis it would generally show similar prevalence. The reason I did it was that we had a firm belief that we were somehow different from the rest of the world because we feel as if we have a lot more drug users".
3.23 They also questioned the need to evaluate all interventions and raised the issue of the organisational evaluation capacity:
"If you have something - of which the efficacy is established - and you have evidence to believe that it's transferable - to what extent do you deploy resources in evaluating that? To what extent do you audit or check or make sure the process is working right because we can't evaluate everything? Often we can't evaluate anything because we are so lacking in the available skills, capacity and time".
3.24 The discussion of the issues surrounding the use of evidence illustrates the need for a broader and more nuanced understanding of what we mean by evidence. The terms 'knowledge', 'evidence', 'information' and 'data' are often used interchangeably, without regard for the finer distinctions and interpretations of what the terms mean to different parties. Greater clarity may assist in thinking through the issues and strategies for implementation, and may help in understanding the lessons of organisational quality and service improvement initiatives that have not always made the gains expected:
"On the spectrum of: data-information-knowledge-wisdom, Collaboratives are currently more about data and information than knowledge or wisdom. So much of what people know and feel remains locked up in their heads, and Collaboratives do little to liberate this… Knowledge is the step beyond information; it is 'the capacity to act'… it is knowing what to do with the best practice you hear about and how to apply it to your local situation - know-how not just know-what"21.
Learning from patients and staff
3.25 The use of evidence must also include listening to and sharing the knowledge and wisdom of staff and patients. Research participants emphasised this issue and comments included:
"At the moment, I am seeing the health service from the side of a relative and I just see a catalogue of disasters! It is a disaster. And the system has just closed the gates. Nobody wants to talk to you. There's no joined up care - she's acquired every hospital acquired infection that you can get and it's still not working. That's what drives me! I'm gathering evidence in a personal, experiential way".
"[staff] have presented patients' stories at board level and simple changes have happened, even down to buying thousands of pillows, because patients said there wasn't enough pillows. Simple things like that impact on practice".
"Stories were fed back to ward staff about the patient experience. It's really powerful as evidence. Because you can think that your practice is evidence based - but perception is everything".
"in [……..] there was an example of patients gathering evidence from other patients - perhaps getting different stories than a clinician would have got with people being able to share their experience. If you have motivated individuals within the communities then you should use them. You can then get more robust evidence - and it takes pressure off elements of the health service".
3.26 One participant quoted a recent example from a published paper on Multiple Sclerosis which resonated with them and which suggests that valuing the patient perspective will require a major shift in attitude amongst some clinicians:
"The paper was called 'Doctors and patients disagree' - The doctors thought that walking was the most important thing that worried the MS patients. But the actual thing that concerned them most was losing cognitive function and their fatigue. If doctors saw that their walking had deteriorated when the patient walked into the room, that's what they were focusing on and they weren't paying very much attention to the other aspects of their problems - because they know best!"
3.27 Valuing user perspectives is discussed more fully in Chapter Four.
Building capacity for using evidence in the NHSS
3.28 There is a need for capacity building work: a more active approach is needed to make evidence more accessible, contextualised and implementable. The case studies in the annexes provide a number of examples of 'capacity building' work including support for staff to facilitate the use, analysis and uptake of data in practice settings and the use of action learning sets. Exactly what capacity building might mean is worth further exploration.
3.29 One element of capacity building is certainly about being better at disseminating the evidence, making it more accessible and understandable. Participants in this study commented:
"What the practitioners are looking for is the research to be practice based, patient focused, easily accessible to them, and disseminated. There are people doing research but it's never disseminated. Dissemination to an academic is publication or speaking at conferences. But that's not dissemination to a practitioner because they can't understand the publication - what do these statistics mean? What are we meant to do with that? It's about disseminating it in language that will impact on their practice".
"Yes, but we sometimes fall into the trap of taking that evidence that's not understandable and putting it into a policy that's not any more understandable or user friendly than the article itself".
3.30 The value of incorporating the evidence into assessment tools was also recognised by participants:
"if there's an evidence base to what assessment you should be doing, the types of information you should be collecting and the interventions that you should be doing - if these prompts are within the records that staff are using everyday then they're more likely to use the evidence to underpin their practice, than it be sitting separate in a guideline or a protocol, which might be on the internet and the computer's being used by six different people… so then they'll just do what they've always done".
3.31 Another element of capacity building identified by participants was the development of the skills to access and critique the formal evidence available. One commented:
"Skills are needed to critique the research that's out there. There were some guidelines we were using based on b-rated evidence. When you went and got the reference that the guideline was talking about and read the article there was no evidence! But it got into the guideline. People need the skills, knowledge and confidence to be able do this".
3.32 Participants also highlighted the importance of the development of skills and confidence in the validity of different approaches to collecting new evidence:
"Most of the health services have been trained that you need very lengthy audit tools to actually collect data and some kind of sophisticated SPSS and Minitab software to record it all. That's not true. You can use a piece of paper. Anybody can collect data".
"It's not about 3 month exercises; it's not about 100 consecutive patients either! You get so much data. If you're not doing it right for 3 patients - why test 100? You've already failed if you're not able to evidence your practice in the 3 patients you've missed already"
"It doesn't have to be randomised controlled trial evidence that absolutely proves to the ninth degree that this is the way forward. You can collect data about the experience of the patient on that journey. You can collect data about the patients on the ward on that day, using that service, that theatre on that day - and it will give you a picture that is really important improvement data. You can then sit down with the team and say 'well, this is what was happening on that day'. That's a very powerful message - for the senior clinicians too".
3.33 The discussion amongst research participants suggests that capacity building goes beyond being better at dissemination and skills development and includes issues of individual and organisational evaluation capacity referred to earlier.
Making space for learning and reflection
3.34 Health service practitioners, like many others, are working in a highly pressurised, target driven context, subject to constant change where the imperative is action, not reflection. They need recognition of the value of reflection and the time, space and resources to reflect on their practice, review and evaluate evidence and revise their actions. Respondents commented on the challenges of combining front line NHS work with research and reflection:
"The ability to think is often driven out. The space to think, to collect evidence and to evaluate things is very hard won".
"If you're engaged in any kind of service development from an operational point of view, one of the things you don't do is write it up. I'm horrified by the amount of stuff we've been involved in that we've never written up".
"We don't have the time to do that because we're always responding to other pressures".
"It would be quite difficult to say 'take a couple of days off and write up that stuff we did'. We're a hard pressed health service; you can't have folk messing about doing that kind of stuff. In some ways where it is practitioners that have done stuff it's the only way you're going to get it done - if this is seen as a reasonable part of your job as a practitioner".
3.35 However, some participants expressed a note of scepticism about this 'culture of busy-ness':
"It's an excuse. They all cluster back - there's a paper that talks about the nursing station being the island and the ward being just a sea of sharks. They spend more time rushing around than sitting down with individuals and spending time with them. The culture stops them taking time to reflect and think about the way that they're working".
3.36 A number of factors prompt review and reflection. Some participants suggested it can be a response to an incident or crisis identified by practitioners or managers:
"Say there's been an incident happened and they've all come together and looked at the evidence and changed it - there the practice changes long before the guidelines are finished, published and ready to go out - because it's been generated by the practitioners themselves".
"We find that in [operating] theatres there's a very hazardous environment and a lot of changes to what we do are risk management driven. Even something as simple as the way a scalpel is handled. It has changed radically in the past 5 years. Again it's been driven by the practitioner because it's been identified experientially - the way we were doing it before was causing injuries, so the practice has changed to try to minimise that as much as possible".
3.37 Others suggested that managerially driven service redesign initiatives and audit procedures were also often a prompt for a review of the use of evidence:
"We've always known that malnutrition in hospitals has been a problem. Suddenly because we're about to be audited [we're looking at it]. Division-wide or nationally - being audited supports using evidence".
"When you have a major service redesign - that is the time when you have everybody looking at the patients' journey, the best evidence. It's something that's often enforced on you - it's not often a choice that you've made".
3.38 Another view was that the promise of additional resources offered an effective lever for evidence gathering:
"My experience has been about 'what's in it for me?' when thinking about change management. None of it was about the evidence. They were completely turned off to the fact that it was better for patients. At that point, it was 'could we get more clinics? What's in it for me?'"
3.39 However, participants also acknowledged that such activity can distort and detract from quality improvement initiatives:
"You will not get a GP [here] to look at osteoporosis because it brings no income to GPs. It's not part of the quality and outcomes framework".
"Isn't that the crux of the problem? We're spending money on issues that are not contributing to quality. So the quality indicators are not set up to do the right thing. The right thing is to improve care for the patients - not to pay GPs for all the things on the list that they think are a good idea to do because it brings in a big income".
3.40 Participants also identified academic study and leadership development programmes as drivers for more practitioner driven reflective practice:
"You might be asked to do a reflective piece of work or review a local protocol or something like that as an assignment and then you'll get one individual in that ward that's desperate to change something in their ward because they know it's far from perfect. It's about it being a bottom-up approach".
"We know that when senior charge nurses have been through a leadership programme one of their key responsibilities is about improving the quality of care and they seem to be much more focused on 'right, what is it we're actually needing to improve in my ward?' and starting to do audit and to look at where the improvements need to be made. They'll then go searching for the evidence to guide them in what they need to do. Before it would have been top-led - 'there's a set of guidelines - you're to implement them'".
3.41 These examples suggest there is also a need to rethink what is meant by evaluation of practice. It is too often enforced for audit purposes and so is seen as a burden, hurriedly completed at the end of a project by someone else rather than an integral part of everyday practice. Two comments on this issue from participants were:
"A lot of the work I'm involved in is pilot work that's been funded externally. What often happens is that funding is made available, people are approached with a deadline of about two weeks to come up with something because we've got this chance to get loads of money, but nobody mentions evaluation and then it's something people have to do. It's that time to think thing again. It's not that those resources are wasted, but that they could be better used if there's more time to think and plan".
"People are scared of evaluation. They think that they need to have specialist research training - they don't think they have the capability and whilst that may be true for some people, it's not rocket science".
3.42 However, other participants suggested that service redesign initiatives and improvement methodologies can clearly play a positive part in formative evaluation, challenging assumptions and promoting reflection on practice:
"In the Intensive Care Unit there were lots of assumptions made by the team that the way they practiced was evidence based and that they were providing the best care according to the evidence. We were asked to test elements of that on a rapid cycle basis - that's based on hours and days, not weeks! We used PDSA cycles and what happened was that within 24-48 hours we were able to demonstrate to that team that they did not practice in accordance with the evidence - so they then focused on the tasks that they would have to undertake".
"It's facilitating people to get to that incongruence stage where they recognise that what they're doing is not linked to what the evidence says and the impact that that is having and through learning to take ownership of actually doing something about that".
"That's about unlearning. That's about getting to a point to say 'I've learned not to do that any more' - unlearning practices of the past, before you can take that next step. So you can learn how to learn".
3.43 Research participants suggested that recognition of this 'incongruence' or 'unlearning' seems to be the crux of the issue of research-into-practice. The task is about how to help individuals realise for themselves how their attitudes, values and behaviours may be inhibiting change, rather than being told that they should adopt particular written guidelines or adhere to a code of practice, however well intentioned.
3.44 Amongst research participants there was some experience from elsewhere of the value of approaches, such as action learning, to transform practice. For example, one participant commented:
"A nurse in A & E was sitting listening to a story [in an action learning set] from a nurse in one of the medical wards and that changed the way she practiced in A & E. She realised that their target was [reducing] this 4 hour trolley wait and getting patients through. When she actually listened about how this impacted on the care of patients in the medical ward she reflected and it changed practice. That's such a powerful thing".
3.45 That individual realisation and responsibility is the key to information about good practice becoming good practice, by being recognised, contextualised and therefore implementable. But facilitating change in service delivery is not solely about individuals changing the way they behave; there is also an issue of the values of the organisation. Examples quoted by participants illustrated the importance of what is valued in the organisational culture and how that affects individuals' behaviours and attitudes:
"She said… 'I know I'm a good A & E nurse because my colleagues tell me I'm a good A & E nurse - and the doctors tell me that I'm a good nurse. I'm well known. I can get a drunk out of here in 20 minutes. I just pull him off the trolley by his ear and take him out the door'. She was valued because she cleared the spaces. But it ain't good practice!"
"It's about having a range of values - yes, we are interested in improving the cleanliness of our ward. We are interested in getting better physical outcomes for patients, but we're also interested in them getting good psychological support while they're here and that means that nurses sit down and actually speak to people. If it's not valued - then nurses will be busy cleaning the ward and keeping everything pristine and giving IV drugs and making sure they get out within the 4 day time limit or whatever the target is, but if the other range of quality measures are not valued then they don't act in that way".
3.46 Some of the approaches in the case studies focus on developing this kind of transformational learning, rather than on more traditional 'research' skills such as data handling and interpretation. They facilitate the articulation of the values which underlie how people behave, and this also seems to be a key distinction between some of the service redesign and improvement methodologies, such as Collaboratives, and action research approaches, such as action learning sets and whole system approaches. They hold out the hope that, for example, the A & E nurse, who's good at dealing with drunks, might be enabled to discover for herself some other value in her work that also supports patient care.
CHAPTER FOUR: THE POTENTIAL FOR DEVELOPING PARTNERSHIPS WITH THE ACADEMIC COMMUNITY AND OTHERS
4.1 This chapter examines the issues around the facilitation of the use of evidence and collaboration between practitioners, academics and other external consultants and with service users. Partnership working is a key issue for the NHSS, and a key objective of this study was to assess the readiness of the academic community to engage in applied research and action research.
4.2 The facilitation of the effective uptake of research evidence into practice is a theme running through many of the case studies and discussion amongst participants. Other research suggests that successful implementation of research evidence in practice depends on the relationship between the evidence itself, the cultural and organisational context and facilitation 22. Facilitation may be the key variable in understanding the use of evidence: it is this, not the strength of the evidence, which affects implementation.
4.3 It may be helpful to make a distinction between facilitation as the brokerage of evidence and facilitation as supporting individual learning or mentoring. Brokerage would include the signposting, sourcing, interpretation, distillation, commentary on quality and dissemination of evidence; and perhaps also involving some capacity building in terms of support for the collection and analysis of new evidence. Mentoring would include support for individual learning and reflection through a range of approaches.
Facilitation as knowledge brokerage
4.4 Research participants identified a need for collation, coordination and mediation of the available evidence, and stated:
"Evidence needs coordination. It would be good to have an 'evidence portal' - some kind of screen or filter to help separate the wheat from the chaff. Any field will have rogue studies and part of the game is to find evidence that supports what you want to do"
"It's not enough to have the evidence and access to it - there needs to be a certain amount of mediation".
"We need somebody to source that 'best practice' for you. Your knowledge is going to be limited. Even a simple example often turns out to be a huge minefield! The evidence is beginning to come from NICE but it's not impacted on practice. Sometimes the evidence isn't there".
4.5 Knowledge brokerage would provide a systematic, enabling approach to accessing available evidence, rather than making it another individual responsibility. Participants suggested that a more efficient way of doing this on a national level would be a great help, but that at an individual level, they wanted there to be recognition that the extent to which people are expected to use evidence may vary according to different roles. For some, it would be their job to manage the evidence base, acting as a broker, signposting others to sources, and perhaps providing a 'quality awareness' commentary on the sources.
Facilitation as supporting learning
4.6 Facilitation is also important to support individual learning and helps to develop ownership of the changes to practice and so makes them more sustainable, as well as enhancing the skills of participants.
4.7 Key to sustained and transformational learning is 'double-loop' or second order learning 23; this goes beyond single-loop learning where on discovering something doesn't work, individuals try to find more effective ways of achieving their goals. This is often insufficient for solving more intractable problems and may make the situation worse. Double-loop learning encourages a focus on problem setting as well as problem solving: here failure to achieve intended consequences would lead to reflection on the original frame of reference and setting of a different problem. The injunction to be 'scientific' or rational is equivalent to restricting attention to single-loop learning.
4.8 The way that action researchers engage with participants in the research process is essential to the development of double-loop learning. In action research, participants become researchers of their own practice and develop their own practice-based theory, based on testing out good practice within their particular context and refining it. The role of the facilitator is not to give advice: part of their role is to help participants become aware of their existing frames of reference and begin to 'unfreeze' or 'unlearn' them, learn new frames of reference and develop skills of reflection.
4.9 In this way, action research supports individuals to take responsibility for their own learning, to re-interpret evidence and previous experience, rather than just simply acquiring more new knowledge. The skills of listening and asking questions, as well as analytical skills, are important elements of a facilitative approach. By engaging in the reflective processes of action research, participants can begin to understand how to facilitate this process in others.
4.10 The use of 'stories' is also an important element of many of the case studies and research participants' examples. These are descriptive, first person accounts of something that involved the story-teller directly. When shared, they provide resonance, connection and insight into the realities of service delivery, particularly where they also involve service users. Sharing stories or 'critical incidents' through tools such as significant event analysis, action learning sets, appreciative inquiry or whole systems working can open up inquiry and provide energy for collaboration and change.
4.11 It is not suggested that any single approach should be adopted, but that a greater range of approaches might be more effective in bringing about more transformational learning. One participant commented:
"There has to be a range of implementation strategies - the PDSA cycle, action learning, and reflective practice - a whole range of stuff not just picking one methodology off the shelf and saying that'll work for every change we want to make. But we do need to change the culture of the organisation and the culture within wards even".
4.12 These approaches require facilitation in their own right, which raises questions about where these skills are available and the scope for collaboration with academics.
Developing collaboration with academics and other external facilitators
4.13 One of the objectives of this study was to assess the readiness of the academic community in Scotland to engage with health practitioners in using applied research and action research. Some of the case studies illustrate how academics and other external facilitators have been able to support the generation and use of evidence in practice through a variety of interventions and approaches.
4.14 The stakeholder interviews conducted for this research included a number of academics and others with an interest in developing better collaboration with the NHSS. Those interviewed were chosen purposively for their known interest in these debates and are quite varied in their experience, perspectives and core interests. Some have a substantive interest in the issue of collaboration; others have interests in health service management and organisational development; and others in how they can support learning and the development of reflective practice amongst health service practitioners.
4.15 The interviews with academics provided some valuable insights. One participant commented on the issue of collaboration itself, drawing on other areas of public sector delivery:
"In a Community Planning context, the need for capacity building for collaborative working was recognised by the Community Planning Task Force, but the interpretation of this has been to develop skills frameworks and standards, such as the Standards for Community Engagement".
4.16 Here, as in many other spheres, it was said that the response has been to develop more guidelines and protocols. The preceding discussion illustrates well that this approach misses some very important understanding about knowledge transfer and the implementation of evidence in practice.
4.17 It was suggested by a research participant that what is needed is an understanding of the nature of collaboration. This would entail an opportunity to examine the theories and concepts of collaboration to explore the nature of the practice of collaboration, in order to be more effective in collaborative situations 24. Another participant was able to give an example of the idea that 'there's nothing as practical as a good theory 25':
"Academics are interested in theory and that's why they're useful! For example, a discussion of power relations using action learning in care homes helped the participants to see that actually the relatives are powerful in that context and to use that to bring about change".
4.18 Other participants recognised the value of greater linkages between academics, other external consultants and agencies with an interest in the implementation agenda and felt it would be helpful for this to be coordinated on a central basis:
"There is a need for clear terms of partnership with providers [Universities]. It would be better if we had a generic framework that would address all the issues up front".
4.19 The interviews highlighted some of the issues. One is the ability of the NHSS to articulate its needs clearly to academics:
"[Academics] can have a role in simple interpretation - synthesising evidence, translating it, but the NHSS needs to articulate its needs [to us] and needs mechanisms for identifying their needs".
4.20 In terms of what academics need from the NHSS a clear message from an academic participant was:
"Give us a clear steer, don't change your mind, give us access, be more relaxed about outcomes and allow us to publish".
4.21 Of course, academics are a diverse group, and some participants felt more constrained by the demands of the Research Assessment Exercise (RAE) than others. One commented:
"Academics do need to start in the context of the RAE, but it may be less of a concern than the need for a clear steer from the NHSS. There's a hopeful sign in the use of 'esteem indicators' which give peer group recognition for other activities, advisory roles and so on, although Scotland is weak in opportunities for this kind of thing".
4.22 The demands of different institutions and disciplines may also be a barrier. There is a need for an interdisciplinary perspective, absent from much of publicly-funded research which focuses on the technological and scientific aspects at the expense of understanding the human dimensions of change. Participants suggested that working more closely with clinical teams and patients will bring additional concerns. One commented:
"There can be a conflict between what the University and the NHSS needs from research. Academics need high level support from within the University. Working with the NHSS means that collaboration takes longer and access and ethics issues need to be sorted out".
4.23 However, research participants also argued that initiatives from research funders such as the Advanced Institute of Management (AIM) Research ESRC initiative mean that practitioner focused research should become a more attractive proposition.
4.24 There was also recognition of the challenge to traditional ideas about research through comments such as:
"Perhaps this is nothing to do with research as we know it? It's more a form of operational development which is the responsibility of managers".
4.25 In addition, some of the research participants did not share the basic epistemology of action research, based on understandings of the nature of knowing and what can be known 26. Some asserted that what they did was not action research. For example:
"I don't necessarily use the term action research. I'm more comfortable with evidence into practice notions".
4.26 Others worked in fields where action research is more accepted. They reported that they were able to publish in peer reviewed journals and felt very comfortable with a facilitative way of working. One commented:
"The principles of action research - that it is democratic, participatory, orientated to change and the development of theory are all part of critical social science and I'm very comfortable with that. Action research gives rich data; you can write about process and outcomes as well as about reshaping theory".
4.27 These views raised an issue about the extent to which academics have the interest in or skills to facilitate action research, where part of the intention is to enhance the skills of practitioners as co-researchers. Whilst this is not just an issue of method, by way of example, traditional qualitative research interviewing may be able to challenge assumptions, beliefs and values by careful, probing questioning. However, the purpose of this approach is to gather 'rich' data rather than to help the person learn from the reflective experience in a deliberate way. An interviewer may not have the responsibility or intention to help the person learn from the situation and consider other options and actions, in contrast to an action learning facilitator, for example. The challenges of developing a facilitation style that promotes participation, deliberate learning and action are substantial for many academics schooled in traditional research approaches 27.
4.28 The case studies also show some examples of a more open, systemic approach to inquiry where the role of the facilitator is to provide a supportive framework for dialogue between different interests and to develop reflexivity, reflecting the view that:
"Traditional forms of research and policy debate fail to tap into possibilities for generating new forms of reflexive dialogue and learning at and beyond the policy-practice interface. Here we are exploring notions of systemic and policy learning that privilege learning not about one single element, but rather from mutually influencing relationships"28.
4.29 The case studies show that part of the facilitation task is to provide better opportunities to integrate multiple perspectives from staff at all levels and from patients and service users; and of drawing on and sharing experience and wisdom in order to make this form of tacit knowledge explicit. The evaluation of the health service Collaboratives remarked on the very positive reaction of participants in the Collaboratives to the involvement of patients in their quality improvement work, and reported:
"User involvement has been a particular strength of the Mental Health and Cancer Services Collaboratives, with many participants commenting that this had challenged assumptions and led to new insights… it seemed that the views of patients - and junior staff - were in the event as or more powerful than expert 'evidence'"29.
4.30 This suggests that incorporating user perspectives is crucial to service quality improvement.
Valuing user perspectives
4.31 Research participants also recognised the value of user perspectives. One health board had recently undergone a major consultation exercise, and a participant remarked:
"The amount of consultation we did - particularly with the public, patient representatives, former patients, community representatives - we did get some quite heated discussions about how they felt they were being cared for. I felt that was a very powerful way that we approached it. It was also quite difficult sometimes because of the comments. I did feel it was a genuine attempt to hear the voice of the people that we serve. Some of the thinking of my colleagues did start to change about how they were delivering services and what needed to change - they'd never had that opportunity to hear that before... It did change practice and it did change mindsets. The trigger was the thinking started to become more open and people were in a position to start to talk about how we are going to do it. That powerful message coming from the people we serve- you couldn't ignore that".
4.32 This example brought benefits in terms of challenging professional perspectives and assumptions and so opening up possibilities for change, but also in bringing the ideas and perspectives of the public into the discussion of service redesign and deepening their understanding of how the health service works. The research participant continued:
"A number of members of the public said to me - 'this has been great. I now understand a bit more about what actually happens in the health service!' One of the biggest things I got were how informed the public were. Not just about their [medical] condition, but just on good ideas"
"It made the redesign of clinical services very focused on the experience the patient was having, rather than on the infrastructure that we currently provide and deliver services through".
4.33 However, greater user involvement also raises tensions. There is the danger of tokenism by having public representatives on committees or groups and then ignoring them. The deployment of multiple perspectives will bring different kinds of evidence or ways of knowing into the debate. There is the potential for a clash between public perceptions of 'need', based on their experience, preferences and expectations, and professional views of effectiveness, based on formal evidence, their perceptions of what constitutes valid evidence and their own experience. Other research participants stressed the value of a dialogue between different perspectives, and commented:
"It's all to do with the human dimensions of change. It's about having that dialogue - some thinking and debating time. You may not all agree, but you may agree to disagree, but come up with something that meets most of the needs of that community".
"There's incredible value in getting the community into the room and saying 'right, what do we do about this?'".
4.34 The effort involved in major consultation exercises with the public also raised the issue amongst research participants of ensuring the fullest integration of these perspectives into wider work and being able to learn from what has happened in the past. One research participant commented on the apparent lack of 'corporate memory':
"My experience of the organisation in changes that have happened in the past or are happening currently is that we don't integrate it into what we're doing now. We don't learn from the cycles that we've been through before, so that instead of going round in circles, it's a spiral, we're progressing".
4.35 Research participants also expressed a wish to re-engage with past perspectives, access corporate memory and build on the momentum from public involvement and earlier change efforts to underpin and legitimise current developments. One said:
"There's a frustration knowing that things are going on that are not being successfully 'remembered' - we need a kind of organisational memory. Stuff just vanishes - I call it organisational dementia".
4.36 This links to the capacity issues and the legitimacy of reflection amongst practitioners. It also highlighted a need to be more systematic about recording what does go on anyway. One research participant was concerned that the health system took no interest in 'what happened next':
"If we looked across the organisation I'm sure there are things that people have taken it upon themselves to do [following the consultation]. We didn't track it. Some of the changes will be about how people behave. We also didn't have a good baseline of what was going on before".
4.37 Research participants also acknowledged the need to feed back to the public and stressed the importance of communication with the community. For example:
"There are a few things going on in some areas. There's a big opportunity on the public involvement side - we're missing a trick if we're not feeding it back to the community".
4.38 Feedback loops between clinical and organisational development teams are also crucial; research participants said that each is often unaware of what the other has done. They also identified a reluctance to 'label' or take credit for service developments.
4.39 The case studies (see appendices) highlight a number of ways in which user perspectives might be integrated into research. For example:
- The cooperative inquiry group was composed solely of diabetic service users. This process demonstrated the value of their experience of services, developed their own skills and gave them greater confidence to engage with professionals through membership of planning groups and other representative positions. This approach also allowed the group to develop greater awareness of service provision issues. The health service also learnt how to listen to the lived experience of patients, and feedback from the group helped to set service standards and good practice guidelines.
- The views of foster carers and adopters on the family assessment process were integrated into another approach through the use of focus groups co-facilitated by a researcher and a practitioner involved in the wider action research project. This aspect of the project design was particularly valued by the professionals.
- The whole systems approach brought professionals and a larger number of service users together to look at the realities of hospital discharge. This demonstrated the central contribution that can be made by service users and their carers as well as by professional staff.
CHAPTER FIVE: OPPORTUNITIES AND BARRIERS
5.1 This chapter summarises the main themes from the earlier discussion and outlines a number of opportunities and barriers for the creation of communities of inquiry in communities of practice within the NHSS.
Creating communities of inquiry within communities of practice
5.2 The purpose of this research was to begin to explore how the Scottish Executive Health Department might develop communities of practice within the NHSS. Communities of practice are groups of professionals who share a common language of practice; this includes their values, knowledge, terminology and procedures acquired through education and experience. They have been regarded as communities where:
"People share their experiences and knowledge in free-flowing creative ways so as to foster new approaches to problem solving and improvement, help drive strategy, transfer best practice, develop professional skills and help companies recruit and retain staff30".
5.3 Communities of practice are central to understanding ways in which practitioners make sense of formal codified knowledge, guidelines and protocols and their own tacit knowledge drawn from their practice. The focus moves from:
"…exploring an individual's knowledge as an asset to be potentially transferred, to exploring collective knowledge, which is situated and context specific. In a community of practice, knowledge is constructed as individuals share ideas through collaborative mechanisms such as narration and joint work"31.
5.4 There are clearly a number of individuals, both clinicians and managers, within the NHSS with an interest in applied research and action research as a means to support evidence based practice. There are also a number of academics and others interested in greater joint working, with the skills both to add to the evidence base itself and to build the capacity of practitioners to use evidence. The case studies suggest that there is a wider pool of those able to facilitate action research inquiries of various kinds. There will be many specific opportunities for inquiry, but their identification should be a local, collaborative matter, dependent on the particular priorities and interests within each context.
5.5 It is worth reflecting on what factors might promote the development of a community of practice. Many of the service redesign and improvement methodologies have sought to develop learning networks, rather than communities of practice. The evaluation of the Collaboratives suggests that:
"….knowledge dissemination and transferability only occur when there is a collective identity and the existence of a wider social network, neither of which seem to be fully present in the NHS collaboratives we have studied32".
5.6 Many service improvement initiatives are time-limited projects which focus on information sharing and the replication of best practice. What is needed is the reinvention and local customisation of quality improvement approaches, to increase the absorptive capacity or receptivity within organisations to facilitate the integration and use of evidence in practice. The development of communities of practice, and specifically of communities of inquiry within communities of practice, is an approach that can be used to foster this customisation and local collaborative working for quality improvement in public services.
5.7 A community of inquiry within a community of practice would use action research to break down the division between those who produce knowledge and evidence (researchers) and those who use it (practitioners). Both parties would redefine their roles and develop a set of common values, norms, terminology and procedures. Communities of inquiry may be fostered by providing opportunities for real face to face working, extended social contact, joint learning sessions, use of what action researchers refer to as 'stories' and 'storytelling' and other informal, creative opportunities for exchange and co-creation of knowledge by working together33. By identifying and enabling communities of inquiry within communities of practice to develop, greater knowledge transfer to other settings can be facilitated.
5.8 Table 5.1 below summarises the main themes of this research and frames them in relation to the development of communities of inquiry. It sets out a number of opportunities and barriers to the development of communities of inquiry within communities of practice, and factors which enhance opportunities for exchange and sharing within and outwith the immediate organisational context. It identifies these opportunities and barriers from both practitioners' and service users' perspectives, from the nature of the task that is the focus of inquiry, and from the organisational and wider environment.
Table 5.1: Creating communities of inquiry in communities of practice34

Practitioners and action researchers

Opportunities to reflect on action and act on reflection are enhanced by:
- Openness to challenge and having professional assumptions questioned
- Getting out more - seeing what goes on elsewhere
- Sufficient staff autonomy and freedom to take initiative
- Individual commitment to professional development and new skills
- Explicit, conscious practice of those values espoused by the organisation
- Awareness of personal and professional frames of reference that inhibit the capacity to act in ways which are more congruent with expressed purpose and actions
- Ability to validate the significance of individual cases/stories
- Openness to intended and unintended consequences of actions
- Ability to decide on key methods and targets that suit local context and priorities

Barriers to reflecting on action and acting on reflection are presented by:
- Restraints of roles, divisions of labour and job descriptions
- Lack of ability and skills to reflect/act critically
- Focus on delivery as the imperative
- Continued adherence to quantitative methodologies in all areas of clinical and organisational practice
- Preference and history of seeking patterns rather than puzzling
- Cynicism and lack of buy-in to wider goals
- Undermining of commitment to change by the failure of the organisation to practise the values it espouses

Opportunities for exchange and sharing are enhanced by:
- Ability to self-select rather than be selected by management directive
- Personal passion, commitment and energy for the process of exchange and dialogue
- Access to other people doing similar things in the same organisation and elsewhere
- Openness of the boundaries of the network - not a closed group
- Seeing diversity as a rich source of information for improvement
- Skills and techniques to help recognition of tacit knowledge and to identify who else might be able to use it
- Credit/reward for sharing knowledge and experience with others
- Ability to pay dual attention to personal, local concerns and the bigger picture
- Sufficient time for initiatives to run as long as energy and commitment are there - not constrained by other imperatives

Service users

Opportunities to reflect on action and act on reflection are enhanced by:
- Organisational and professional respect for the experience of service users
- Recognition of the validity of qualitative evidence by all parties
- Confidence that views are important and will be heard and acted on
- Feedback on what happens to evidence provided by service users

Barriers to reflecting on action and acting on reflection are presented by:
- Tokenism and undervaluing or rejection of perspectives
- Entrenched professional agendas and power dynamics
- Continued adherence to quantitative methodologies in all areas of clinical and organisational practice
- Patient care and professional interests seen as in conflict

Opportunities for exchange and sharing are enhanced by:
- Openness and ability to listen by the organisation
- Preparedness of professionals to give up some of their power in order to learn

The task or problem that is the focus of inquiry

Opportunities to reflect on action and act on reflection are enhanced by:
- High importance of the situation or task to those most centrally involved
- Clear purpose for inquiry
- Strong imperative for the issue to be resolved creatively
- Solutions not necessarily clear cut - challenging, but not high risk
- Access to formal evidence of effectiveness where desired

Barriers to reflecting on action and acting on reflection are presented by:
- A focus on urgent tasks, rather than important ones
- No imperative to act any differently
- Lack of adequate skills, resources, time and tools
- Inability to access or understand formal evidence

Opportunities for exchange and sharing are enhanced by:
- Ability to work collaboratively
- Attention to process as well as outcomes
- Ability to ask questions, rather than fix the problem or give advice
- Knowing what to do with the 'best' practice you hear about

Organisational and wider environment

Opportunities to reflect on action and act on reflection are enhanced by:
- Strong organisational expression of values which support learning
- Strategic overview of the range of interventions to support learning from practice
- Awareness of organisational frames of reference that inhibit capacities to act in ways more congruent with expressed purpose and actions
- Permissive management culture - encouragement of reasonable risk taking
- Modelling of reflective practice by managers
- Legitimising feedback of all kinds - compliments, comments and complaints
- Building on existing corporate knowledge
- Working with the energy and commitment of staff
- Encouragement of professional development
- Nurturing of leadership skills at all levels of the organisation
- Positive culture of evaluation

Barriers to reflecting on action and acting on reflection are presented by:
- The 'blame' culture; lack of focus on learning from mistakes
- Failure to credit successes
- Pressure for short term solutions
- Limitation to reactive responses to crises, audit and managerial-led service improvement initiatives
- Organisational history and values that undermine quality improvements
- Preference for forgetting - 'organisational dementia'
- Lack of trust and poor relationships

Opportunities for exchange and sharing are enhanced by:
- Availability of facilitation and mentoring skills
- More strategic and proactive attitude to networking - brokering meetings, presentations or hosting visits
- Opportunities for extended social contact and face to face working
- Informal creative opportunities for exchange
- Development of systems thinking - encouragement to work across boundaries and outwith traditional silos
- Multiple ways of communicating across boundaries and networks - greater 'interactivity' or 'two-wayness' between participants
- Interest in locally contextualised and wider systemic learning
CHAPTER SIX: CONCLUSIONS: SUPPORTING EVIDENCE BASED PRACTICE THROUGH ACTION RESEARCH AND APPLIED RESEARCH
6.1 Whilst there is much evidence of relevance to practice in health and social care settings, there remain a number of issues about access to that evidence, questions about quality, issues about the kind of evidence that is considered to be valid and the remaining problem of the implementation of evidence in practice. Evidence about good practice is failing to become good practice.
Using evidence in practice: an overview
6.2 One perspective is that better dissemination of good practice is the solution, and that the problem is primarily one of access to high quality, reliable evidence. This is certainly an issue, not least for those working in clinical settings. Evidence of effectiveness is necessary and practitioners have to be confident that they have access to the very highest quality evidence, rather than simply that to which they have easiest access. There remains a need for systematic reviews of evidence, but also better brokerage of evidence through signposting, sourcing, interpretation, distillation, commentary on quality and dissemination.
6.3 There is also a need for capacity building to make evidence more accessible, contextualised and implementable. This includes better understanding of the perceived validity and applicability of evidence generated elsewhere and of the local contextual factors that may affect the transferability of evidence. The form in which evidence is disseminated also matters: incorporation into assessment tools, for example, helps to ensure implementation, rather than expecting staff to access web material or read and interpret complex scientific papers.
6.4 There is also a need to build skills in data handling, analysis and interpretation and confidence in the validity of different approaches to evidence gathering. This research has illustrated a number of examples of service redesign and improvement methodologies to promote better dissemination of evidence and build staff capacity.
6.5 A further perspective supports better dissemination, but sees the nature of the problem as being less about poor access to evidence and more about how people are able to learn from evidence. This recognises the inadequacies of applying scientific evidence to practice settings, and that more systematic reviews, better dissemination and capacity building initiatives will not in themselves be enough to address the theory-practice divide. There is questioning of the nature of evidence and of what is meant by 'best' or 'good' practice, and a desire to understand the particular conditions that facilitate the adoption of specific practices within local contexts. This kind of 'practice' and locally contextualised data is absent from many, primarily quantitative, sources of evidence.
6.6 There is a need for a more nuanced and broader understanding of what we mean by evidence which goes beyond reliance on formal or propositional evidence. Many public service systems are 'data-rich, but knowledge-poor'. Evidence should include the experience and wisdom of staff, patients and other service users. This presents many challenges to organisational and professional cultures, not least in the health services. Greater acceptance and understanding of the validity of qualitative research, particularly in generating data for improvement, is a first, essential step. Greater support for interdisciplinary and practitioner research amongst research funders would help to promote wider understanding of the human dimensions of change and begin to redress the focus on technological and scientific evidence.
6.7 Related to rethinking evidence is the development of a different understanding of evaluation, and the building of individual and organisational evaluation capacity. Formative evaluation and self-evaluation, which provide opportunities for learning and generating real-time feedback, have many similarities to reflective practice and action research.
6.8 The development of reflective practice will require recognition of the value of reflection and the time, space and resources to reflect on practice, review and evaluate evidence and revise actions. Reflection on practice is often risk-management, audit or resource driven. These have their place, but seem more likely to lead to single-loop learning than transformative, enduring learning amongst front-line staff.
6.9 Enabling staff to recognise incongruities in their own behaviour, and to 'unlearn' elements of their practice, seems to be at the crux of the research-into-practice dilemma. This is about helping individuals realise for themselves how their attitudes, values and behaviours may be inhibiting change, rather than telling them to adopt particular written guidelines or adhere to a code of practice. Individual realisation and responsibility, and the 'unlearning' of past practice, are key to information about good practice becoming good practice, by being recognised, contextualised and therefore implementable.
6.10 Facilitating change in service delivery is not solely about individuals changing the way they behave - there is also an issue of the values of the organisation. Action research, which focuses on transformational learning (or learning how to learn), rather than on more traditional 'research' skills such as data handling and interpretation, can itself facilitate the articulation of values which underlie how people behave. This is a key distinction between some of the improvement methodologies such as Collaboratives and capacity building initiatives and action research, such as action learning sets and whole system approaches.
6.11 It is clear that evidence take-up needs facilitation - it is this, not the strength of the evidence, that affects implementation. Facilitation operates at both organisational and individual or group level. A distinction is made here between facilitation as the brokerage of evidence and facilitation as supporting individual learning or mentoring. Mentoring would include support for individual learning and reflection through a range of action research approaches to bring about transformational learning.
6.12 Another part of the facilitation task is to provide better opportunities to integrate multiple perspectives from staff at all levels and from patients and service users, and to draw on and share their experience and wisdom in order to make these forms of tacit knowledge explicit. There is a need to recognise and value local, practitioner and service user wisdom and experience. The views of both front-line staff and service users are too often overlooked, and integrating them presents a significant organisational and professional cultural challenge in health as in other services. There is a need to foster a culture that recognises, values and shares the diversity of perspectives and the learning that they can unlock.
Is action research happening in the NHSS?
6.13 This scoping study shows that a number of approaches on the continuum of evidence-into-practice/action research are being implemented in the NHSS. There are a number of applied research service redesign and improvement methodologies which focus on building capacity, including the Collaboratives. There were some examples of interest in action learning in the focus groups; there were also a number of instances cited of small change initiatives. These were often isolated and not widely shared, even amongst close colleagues.
6.14 A key issue is that there is interest in different approaches to research that can be more immediate and practical. Some is happening anyway, but its legitimacy is not recognised. It can be dismissed as unscientific and anecdotal, and it is rarely recognised as action research. There is no overall framework to advocate and promote the value of the range of approaches identified here, which is a key requirement to foster recognition, legitimacy and learning from research.
Key issues for future collaboration
6.15 There is scope for collaboration with academics in a number of ways. They have a role in developing, critiquing and disseminating the formal evidence base. Some are already active in this brokerage role which involves signposting, sourcing, interpretation, distillation, and commentary on the quality and dissemination of evidence. It may also involve building local capacity for the collection and analysis of new evidence. This research suggests that there is a need for this kind of support to continue, and that there are academics in a position to adopt a greater knowledge broker role.
6.16 Whilst some might dismiss the role of academics in practice-settings, in fact, academics should be valued because of their interest in theory. They are good at the development of models and frameworks and these are often very practical. They could help practitioners to examine the formal evidence, for example, on the nature of collaboration or theories of power relations, in order to behave more strategically and effectively in collaborative and partnership contexts. There is also scope for reviews of the literature, distillations of evidence and reviews; this kind of material can be introduced into action learning sets or action inquiry groups, as illustrated in the case studies.
6.17 It is evident that many academics and the institutions they work for do not share the epistemological stance (theoretical position) of action research. They do not see what they do as action research and are more comfortable at the 'evidence into practice' end of the spectrum. Many do not work in the facilitative style of action researchers. Thus, there is limited capacity amongst academics in Scotland to provide mentoring support for individual learning and reflection. However, there are a few active action researchers within and outwith the academic community in Scotland and further afield who work explicitly to promote participation and enhance the skills of practitioners as co-researchers.
6.18 There is potential for the development of partnership working between NHSS practitioners, academics and others. Some are already engaged in this kind of joint working. Both the orientations towards evidence-into-practice and action research have value and fit into the perspective that a range of approaches is needed. There is still a role for traditional research and for building traditional research skills. However, action research does challenge conventional ideas of what is meant by research.
6.19 Collaboration between the NHSS and academics and other facilitators will require a high level promotion of the legitimacy of this range of approaches, both within the Scottish Executive, the NHSS itself and within academia. It will rest on active brokerage of communities of inquiry within communities of practice across the health service and other agencies to make connections across interdisciplinary, organisational and geographical boundaries.
6.20 Communities of inquiry are not 'learning networks' in the accepted sense. Instead, they allow for the re-invention and local customisation of quality improvement approaches. They use action research to break down the division between those who produce evidence and those who use it. They flourish where there are opportunities for real face to face working, extended social contact, joint learning sessions and other informal, creative opportunities for exchange and co-creation of knowledge by working together.
The improvement of public sector delivery
6.21 The issue of using evidence in practice is not confined to the health sector. These debates cut across the public sector. Efforts to be better at dissemination, to distil and communicate best practice and to issue good practice guidance, standards and protocols are a feature of public policy in whatever sphere.
6.22 Whilst much of the data used here comes from a health and social care perspective, there is a strong message for policy makers and practitioners in other areas of public sector service delivery with an interest in integrating evidence into practice. Too often practitioners, policy makers and managers are unaware of developments in other sectors, yet many are struggling with the same dilemmas and looking for 'new ways of working' in order to generate systematic and sustainable change in a number of public service environments.
6.23 There is a spectrum of approaches represented here. They cover a range of perspectives and not all would be seen as necessarily 'action research'. There is scope for a range of approaches to be used and for adaptation of approaches to suit the circumstances and the type of evidence required.
6.24 The sharing of 'stories' is an important approach; they provide resonance, connection and insight into the realities of service delivery, particularly where they also involve service users. Sharing 'stories', narrative accounts or 'critical incidents' through tools such as significant event analysis, action learning sets, appreciative inquiry or whole systems working can open up inquiry and provide energy for collaboration and change. These approaches offer the best chance for transformational learning and system wide sustainable change.
6.25 The development of communities of inquiry within communities of practice can enhance wider organisational learning. Action research can promote collaborative approaches to learning from practice to enhance organisational learning as well as individual learning, when part of a deliberate organisational strategy to value and share knowledge, experience and wisdom of all kinds across whole systems.
6.26 This study has provided a useful review of the opportunities offered by action research and applied research approaches to supporting evidence based practice with a view to improving public sector delivery.
6.27 A range of new ways of working has been identified, and there would appear to be distinct opportunities for National Health Service Scotland to develop communities of inquiry to share learning and knowledge within health systems. Partnership and collaborative working between the policy community, practitioners and the academic community would appear to be a realistic prospect in Scotland.
6.28 The Scottish Executive, National Health Service Scotland and other public sector organisations will be invited to consider the implications of this study.