EMCDDA analysis of implementing North American drug prevention programmes in Europe

The EMCDDA have been looking at whether prevention programmes developed in North America (the USA and Canada) could be delivered effectively in Europe.

The key points the EMCDDA identify are:

  • Cultural infeasibility is often perceived as a bigger barrier than it really is.  Where adaptations have struggled, it may have more to do with low prevalence and high social protection in Europe than with flaws in the programmes.
  • Adaptation needs care and consultation, but is feasible.
  • Those adapting North American programmes should consider culture and context separately.
  • Thinking about the social capital available may help reduce any anticipated resistance to programmes that have been developed elsewhere.
  • Adaptation needs to be careful not to change the key principles that have made the programme effective in the first place.  Changing illustrative examples to make them accessible to participants is fine, but changing the programme protocol (number and order of sessions, etc.) is likely to change the efficacy.
  • A considerable barrier to implementation may be the perceived complexity of the North American programmes – multi-component, multi-session, etc.

The paper gives examples of a number of programmes that have had European trials, many of which will be familiar to readers of this blog – the Good Behaviour Game, Preventure, the Strengthening Families Program, and Communities that Care.

From my point of view this is a very helpful resource with lots of great nuggets for any of us thinking about how we can introduce or replicate evidence based programmes that have been developed elsewhere.

First process evaluation from the Realising Ambition programme

Launched last year, Realising Ambition is a £25 million, five-year programme to replicate evidence based crime prevention interventions.

Led by a consortium of Catch 22, the Social Research Unit, the Young Foundation and Substance, it is by far the most sustained recent attempt to bring the evidence based programme movement to the UK.

25 projects are being run to reach 145,000 young people aged 5 to 14 over the five years of programme delivery; you can read more about the programmes and programme deliverers here.

A year on from the launch, an interim evaluation of the process has been published which points to some of the learning so far.

Barriers to Evidence-Based Education

Robert Slavin, director of the Center for Research and Reform in Education at Johns Hopkins University’s school of education, asks us to imagine that we used evidence to guide everything we do in schools and suggests:

educators would constantly look at their own outcomes and benchmark them against those of similar schools elsewhere. In areas that needed improvement, school leaders could easily identify proven, replicable programs. As part of the learning and adoption process, they would attend regional effective-methods fairs, send delegations to visit nearby schools using the programs, and view videos and websites to see what the programs looked like in operation.

He argues that there are four barriers to making this happen:

  • Too few rigorous evaluations of promising programs;
  • Inadequate dissemination of evidence of effectiveness;
  • A lack of incentives for localities to implement proven interventions; and
  • Insufficient technical assistance for implementing evidence-based interventions with fidelity.

Slavin is writing about the whole curriculum and indeed whole school interventions, but this seems to apply just as well to drug prevention.

He says that central government have a significant role in overcoming these barriers, not by determining which programmes to use but by:

  • Helping schools get better intelligence on proven programmes and persuading them that deploying them will lead to better outcomes.
  • Incentivising the take up of evidence based programmes through grant funding.
  • Supporting a variety of organisations who can help local policy makers and school leaders learn about proven programmes.
  • Supporting organisations that can support the effective delivery of the programmes that schools choose to implement.

Read the whole article at Education Week: Overcoming Four Barriers to Evidence-Based Education.

RisKit Programme – a multi-component programme for the reduction of risk behaviours in vulnerable adolescents.

Alex Stevens and colleagues from the School of Social Policy, Sociology and Social Research at the University of Kent have been working with KCA and Kent County Council to develop a programme that reduces risky behaviours.

They report that six months after delivering the programme, which was carried out with 226 participants who had been screened for vulnerability, they were able to measure:

significant reductions in alcohol use (as measured by percentage days abstinent and drinks per drinking day). There were also reductions in illicit drug (mostly cannabis) use, although these were not statistically significant.

The programme manual has been made available under a Creative Commons licence and can be downloaded here.

What We Don’t Know about Evidence-based Programs

Pertinent thoughts on the limitations of the evidence base we’re working with…

Interestingly, current replications of evidence-based programs place a priority on ensuring that the programs are being implemented with fidelity to the program model. This is done to help improve the chances that the program effects can be replicated in other settings. This is important, but we have missed an important step: If we don’t understand what it is about the program that made it effective in the first place, then it is challenging to replicate the effects that made the program desirable in the first place.

Read the rest on What We Don’t Know about Evidence-based Programs | Trend Lines.

An emerging movement around evidence based education

While we will continue to be baffled by the current laissez-faire approach that Ministers take to health education, I can't help admiring the speed at which they are trying to advance the cause of evidence based education.

The latest example is that the DfE have commissioned two new Randomised Controlled Trials of programmes – one for maths and science, the other looking at a child protection assessment tool.

Michael Gove says:

We need more hard evidence in the education debate. We also need to develop a better understanding of what counts as effective social work. Randomised controlled trials offer us the opportunity to establish which policies genuinely help children. I am delighted the DfE is embracing a more rigorous approach towards evidence.

The question for me, then, is whether the prevention field already has a head start here – in that there is a history of running school interventions with RCTs and of producing meta-reviews of those trials – and whether that might be a way of engaging the Secretary of State in seeing the benefit of the field to educational and health outcomes.

But it isn’t just the Secretary of State for Education who has embraced this agenda; it seems to me that there are many enthusiasts amongst teachers and school leaders as well – for example this LinkedIn group has a couple of hundred members, while this conference in September is likely to be oversubscribed many times over.

A final thought, this time from Tim Harford, who writes for the Financial Times and presents on Radio 4, and who, on his blog The Undercover Economist, argues that it would be wrong to see this movement as a one-way street.  Talking about how research and practice are not mutually exclusive, he argues:

In short, evidence-based practice in medicine isn’t a case of doctors, brainwashed into believing whatever clinical trials tell them, passively awaiting instructions. It’s a two-way street, where some of the best ideas for research are suggested by practitioners, and best practice spreads sideways from clinician to clinician rather than being handed down by diktat…
One can see why Dr Goldacre calls this a “prize”. Teachers are better placed than anybody to generate new research questions, based on years of observation of subtleties that would escape any educational statistician.

This seems right to me and fits, I think, with the model of programme development that is set out in the EMCDDA’s standards for drug prevention which talks about justifying the need for an intervention, understanding the target population and tailoring it to the needs of that population – all things where research and practice need to be working hand in hand.

School based programmes for smoking prevention

The Cochrane Collaboration have published a review into school based smoking prevention programmes, which updates a review of the evidence base from 2002.

The headline finding is that programmes that combine life skills and a focus on social influence seem to be the most successful, with those trials that were examined showing significant effect at one year and at the longest follow-up point.

Interestingly, the review finds that trials looking at a social influence model on its own haven't shown a significant effect, nor have programmes that seek to combine classroom lessons with interventions outside the classroom, or ones that rely on information provision alone.

Evaluation of In:tuition

Alcohol Research UK have published the process evaluation and feasibility study of In:tuition, DrinkAware’s life skills programme.

There are some very interesting observations.

On the positive side, teachers and pupils clearly see the programme as useful and engaging.  The resources and tools were seen as comprehensive and useful, and the flexibility around the on-line and off-line tools was also appreciated.

More challenging is that while the evaluators found that primary schools were more able to find the time to undertake the 10-lesson programme, secondary schools really struggled.

Time for PSHE can be very limited and programmes crowded.  Five of the 15 schools, including the 3 schools that piloted all 11 secondary lessons, did so with a small targeted group of pupils, which offered more flexibility.  Of the 10 schools that piloted secondary lessons within their timetabled PSHE education programme none completed all the lessons and only 3 schools used most of them.

As you might imagine, this is something of a challenge for those of us who believe in evidence based programmes – given that those that have shown longer-term effects all come in at this sort of length or longer.

Another worry is that there is some evidence that delivering only parts of programmes may have iatrogenic effects – i.e. lead young people to be more likely to drink or take drugs.

I also see that one of the recommendations from the evaluation is to reduce the length of the programme.  This clearly meets the needs of schools; the question will be whether it would impact on the public health outcomes that DrinkAware are hoping the programme might have.

Implementing Evidence Based Programmes

I’ve noted before that while having an evidence based programme is one thing, being able to implement it well is just as critical to getting good outcomes.

The DfE have just published a paper on implementing evidence based programmes in children’s services and it looks to have some interesting key messages that we may want to think about as part of our work.

Here are the key findings they report:

  • Carefully planned and well resourced implementation is critical to achieving better outcomes and programme success.
  • Implementation of an evidence-based programme may be aided by the involvement of an implementation team to plan for the changes that are required at four different stages: exploration and adoption of the programme; installation; early implementation and full operation.
    • Exploration and adoption: Before selection, careful consideration should be given to whether a programme could work in the local context, with existing agencies and available resources.
    • Installation: Planning for successful implementation of an evidence-based programme requires change at the practitioner, supervisory and administrative support levels, as well as the system level (Fixsen et al 2005). There are however no purely administrative decisions – they are all treatment decisions (National Implementation Research Network). Support for these changes has to be resourced both prior to and during implementation.
    • Early implementation: The implementation phase requires ongoing support and fidelity monitoring, as well as evaluation of the new processes being introduced.
      • Maintaining fidelity to the original evidence-based programme has been improved by working with a ‘purveyor’ – individuals or groups who work in a systematic way with local sites to ensure that they adopt a pure and effective model of the programme.
    • Full operation: Over time the programme should become accepted practice, staff become fully competent and procedures become routine. Sustainability of a programme depends on commitment to ongoing funding and continued staff training and monitoring.
  • Examination of the experiences of implementation of four high intensity evidence based programmes in children’s services has shown that it is possible to successfully implement them in a different cultural context. This has been aided by maintaining fidelity to the programme, but allowing some planned adaptations to processes to accommodate different national and local systems. Successful implementation was boosted, for instance, by early concentration on changes in staff working patterns, careful focus on referrals of appropriate clients, and modification of training materials to suit local culture and language needs.

The full report can be downloaded here.