Dr Jonathan Sharples explores evidence-based practice.
The headteacher of an inner-city primary school is stuck. She has just had a meeting with her senior management team to discuss how they can do more for their struggling readers. On the positive side, everyone contributed well and came up with some great ideas. Her deputy has suggested providing one-to-one tutoring, but she cannot be sure the expense is worth it. The literacy leader is certain he has heard of a scheme that recruits volunteers from the community to do the same thing; he is positive he read about it in a magazine somewhere. The Special Educational Needs coordinator thinks the problem might lie in the way they are teaching all children to read, and that perhaps they should look for something more effective across the whole school. Now, to add to the confusion, a colleague from a neighbouring school is on the phone, telling her about a really exciting pilot project that uses a new computer programme to help those who are struggling.
These are the kinds of decisions faced every day by schools and colleges across the country, whether they are choosing a new literacy programme, developing a behaviour management strategy, or introducing a new approach to social and emotional learning. In scenarios like these, research evidence still plays a relatively small part in informing professional decision making; practitioners' own experience, and that of colleagues, is much more likely to influence day-to-day practice. A similar situation might apply to a police sergeant weighing up options in a domestic violence case, or a social worker faced with referring a looked-after child. Inevitably, too many important decisions are based on best guesses and are overly influenced by politics, marketing, anecdotal evidence and tradition. The result is the classic pendulum swing, in which new ideas and practices are enthusiastically embraced, found wanting and abandoned, only to be rediscovered in cycles.
A paper recently published by the Alliance for Useful Evidence, Evidence for the Frontline, explores what can be drawn from advances in a range of fields to mobilise research knowledge more effectively across social policy and practice. I frame the issue by looking at the individual elements of an effective evidence ecosystem – production, synthesis, transformation and implementation – whilst also considering what needs to be done to integrate these elements more coherently. As well as looking at gaps in the current infrastructure, I pick out some exciting new initiatives and ideas that will hopefully produce tangible benefits for professional practice.
What is evidence-informed practice?
When trying to clarify what we mean by evidence-based practice, it is perhaps easier to start by saying what it isn't. Evidence-based practice is not ‘cookbook’ teaching or policing, nor is it about prescribing what goes on from a position of unchallenged authority. It is about integrating professional expertise with the best external evidence from research to improve the quality of practice. It is important to remember that there is a huge amount of experiential knowledge that is not captured by research and, therefore, that an absence of evidence certainly does not mean an absence of effectiveness. Hence, whilst the term ‘evidence-based practice’ has historical relevance, ‘evidence-informed practice’ is perhaps the more appropriate term.
An important theme covered in the Evidence for the Frontline report is that the demand for evidence must come from a will to advance standards in practice, rather than from a research- or policy-driven agenda. Across social policy and practice, research is too often seen as sitting outside professional practice: something that is done to practice, with practice serving research rather than the other way around. If we compare this to medicine, we see that the communities involved in delivering frontline services are much more infused with a research-facing outlook, so that the people involved in training, research and practice can move more fluidly between these different roles.
It is these inherent gaps between research and practice across many of our public services that make mobilising knowledge so challenging – the wider the gap, after all, the harder it is to bridge. As we discuss, efforts need to focus on ensuring these two worlds can operate with greater synergy and interaction. The ultimate goal should be straightforward: to empower professionals with evidence.
Recommendations from the Report
Researchers/Intermediaries
- Research and development (R&D) should be framed in terms of an ‘evidence pipeline’, which takes developers on a journey from promising innovations through to large-scale, proven models. This process should be underpinned by research methods that are appropriate to the point of development and the resources available at that stage.
- Whilst more experimental trials (e.g. randomised controlled trials, RCTs) should be welcomed, they should be seen as valuable tools within the developmental timeline of an intervention or strategy, rather than as a research panacea.
- Schemes such as the ESRC’s Knowledge Exchange Opportunities should be expanded, enabling social science researchers to be embedded in frontline services. Likewise, opportunities for practitioners to get involved in Development and Research (D&R) partnerships with universities should be encouraged.
- Knowledge mobilisation activities should be extended beyond simply communicating research to considering how it is effectively engaged with and applied in practice. A range of brokerage activities, which support interactions between researchers, practitioners and intermediaries, should be funded and evaluated.
Practice
- A concerted effort is needed to build the necessary time, skills and resources within practice to support research use at scale. Examples of activities that would help include:
a) Wider training and ongoing professional development opportunities to equip professionals with the skills to understand, find, share and use research.
b) Recognition for leadership that supports research use within professional settings.
c) Commitment by organisations to collectively use research knowledge to inform practice.
d) Professional networks that can support knowledge mobilisation and share expertise between organisations.
- Professional bodies, such as a proposed College of Teachers, should be empowered to play a coordinating role in supporting evidence-informed practice and setting professional standards, led by practitioners and at arm’s length from government. There should be strong attachments to university departments and opportunities for crossover between academics and practitioners.
Policy
- Government needs to ensure there is coordination across the different elements of the evidence ecosystem, including research databases, programme clearinghouses, dissemination and brokerage activities, and capacity-building efforts within practice. This is crucial as sectors become increasingly decentralised.
- To address inconsistencies in the implementation of evidence-based approaches (e.g. restorative justice, formative assessment), as much effort at the policy level needs to be placed on how the evidence is applied as on what the evidence says. Enterprises such as the Education Endowment Foundation should be expanded and replicated to ensure a regular throughput of proven innovations and to help get the evidence working in practice.
For further details, please contact Dr Jonathan Sharples
Manager of Partnerships, Institute for Effective Education, University of York
Email – jonathan.sharples@york.ac.uk