EAST: Four simple ways to apply behavioural insights
Owain Service, Michael Hallsworth, David Halpern, Felicity Algate, Rory ...
One solution to this problem is to help people make concrete, specific plans.
One way this has been demonstrated is by simply asking people to write down their plan. For example, encouraging employees due for vaccinations to write down the time and the date of the appointment increased vaccination rates by 4.2 percentage points.50 Similar effects are seen by prompting people to write down (on a sticky note provided) the date of a colonoscopy appointment and the name of the doctor who is to carry it out.51 An even better approach is to identify any barriers you are likely to encounter, and then plan how to overcome them.52 For example, if the goal is to lose weight, one may identify the temptation of the canteen desserts as a barrier. A simple (but specific) plan might then be: ‘When in the canteen, I will always go to the checkout next to the pieces of fruit’.53
This ‘implementation intentions’ approach is successful because it is timely:
it recognises the power of the situation to lead us astray from our goals.
Advance planning helps people respond ‘in the moment’ in a way that moves them closer to their goal, rather than away from it.
To implement these ideas, policy makers could: identify points when people are likely to set particular goals; highlight common barriers to achieving them; and show the plans others have used to overcome these barriers. This approach will be particularly effective for goals which require repeated actions to achieve a future payoff, like saving and eating healthily.
5. Applying behavioural insights

The Behavioural Insights Team has developed a methodology that draws on experience of developing major strategies for the UK Government, a rich understanding of the behavioural literature, and a rigorous application of tools for testing ‘what works’.
The application of the EAST framework is at the heart of this methodology. But the EAST framework cannot be applied in isolation from a good understanding of the nature and context of the problem. Therefore, we have developed a fuller method for implementing BIT projects.
Our approach starts with establishing clarity of purpose (Step 1: Define the outcome).
We then look to develop a rich understanding of the experience of service users and providers (Step 2: Understand the context). Only then is it possible to consider how insights from the behavioural sciences, as set out in the EAST framework, might apply to the problem at hand (Step 3: Design the intervention). The final step is to test the intervention, so that we can understand its impact – and adapt future approaches accordingly (Step 4: Test, learn, adapt).
Behavioural Pitfall 3: Why ‘behaviour’ is different from intentions, beliefs, or attitudes

It is important to recognise that changing behaviour is different from changing people’s intentions, beliefs, or attitudes. These factors will often shape our behaviours, but not necessarily directly, nor in the ways that we might expect.
When asked to report their past behaviour, people often make errors.
The 2008 Physical Activity and Fitness Survey asked people how much exercise they did, and also measured (with accelerometers) how much they actually did. As the chart below shows, there was a considerable gap between the two.54

[Figure: Percentage of people meeting minimum recommendations of physical activity, self-reported vs. measured]

At the same time, people often state an intention to do something that they do not follow through on (the ‘intention–behaviour gap’). A large proportion of people who respond positively when asked if they intend to exercise fail to actually do so.55 Similarly, individuals often do not fulfil their predictions about how they will react. One report, for example, found that individuals thought that they would not be affected by social norm messages in letters from HMRC.56 But when the Behavioural Insights Team tested similar messages, it found large, significant increases in response rates.57 In all kinds of areas of interest to public policy makers (for example, race relations58 and climate change59), we find a similar distinction between what people say they believe and how they act in practice.
What is important, then, is to focus on, and objectively test, what affects people’s behaviours, and to be cautious about using associated intentions or attitudes as a measure of success.
Step 1: Define the outcome

The first step is to be clear about the intended outcome of the project or policy. Wherever possible, this should be a quantifiable change in behaviour. A clear outcome is essential because it drives the design of the intervention. Questions to consider include:
· What is the key metric that would demonstrate success? Can you use routinely collected data? (In our work on employment we used the existing performance indicator that Jobcentre Plus already uses — the off-flow rates from benefits).
· How large an improvement would be needed to justify doing the project in the first place? Much of this will depend upon a basic understanding of costs and benefits (e.g. a zero-cost intervention will justify a smaller improvement), but also on what sample size is required to demonstrate the effectiveness of the intervention.
· During what time period would we hope to see the improvement? Knowing whether it is reasonable to see an instant change or a longer term response will help define what kind of project you eventually run.
The important thing is to ask constantly: what change are we trying to achieve?
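To make the sample-size point above concrete, here is a minimal power-calculation sketch in Python. It uses a standard normal-approximation formula for comparing two proportions; the 30% baseline and 35% target figures are purely hypothetical, not drawn from any trial described in this report.

```python
from math import sqrt, ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.8):
    """Approximate per-group sample size needed to detect a difference
    between two proportions (normal-approximation formula)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # critical value for the significance level
    z_b = z.inv_cdf(power)           # critical value for the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical: a 30% baseline response rate, hoping to detect a rise to 35%
n = sample_size_two_proportions(0.30, 0.35)
print(n)  # per-group sample size
```

The formula makes the trade-off explicit: a zero-cost intervention that only needs to show a small improvement will require a considerably larger sample than one expected to produce a big effect.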
Step 2: Understand the context
The second component is to understand the system from the user’s and provider’s perspective. All too often, decisions are made in the absence of a good understanding of how the service in question is used or administered.
This is a problem, for three main reasons. First, we know that context matters: apparently small details can have a large effect on behaviour. Looking carefully at the context can therefore reveal which points of a process are affecting behaviour.
Second, we need to ensure that any new intervention does not place an unsustainable burden on the people providing a service.
Third, the people involved in a service (both providers and users) often have valuable insights in their own right. People may mis-predict how a hypothetical change might affect their behaviour, but they will often see problems and opportunities that are hard to detect from central government — especially in contexts that they are very familiar with.
A BIT project will always include a phase of visiting the situations and people involved in the behaviour. This methodology draws from social anthropology (in particular, the field work undertaken by ethnographers) and the ‘design thinking’ model, whose exponents seek to understand the context of a problem prior to designing solutions.60 What this means in practice will depend on the nature of the system and the
outcome set in Step 1. But here are a few examples:
· A BIT team accompanied bailiffs on 6am home visits and spent time with Fine Support Officers in the Courts Service to understand better the fine enforcement system in place. The team observed that final reminder letters were often going unread. This led to a series of trials with the Courts Service which showed that sending a simple text message doubled the size of payments made, compared to just issuing a letter.
· A BIT team spent several weeks in Jobcentres, observing interviews between job seekers and job advisors and understanding better how decisions are taken within Jobcentres. Amongst other things, the team found that job seekers were required to sign nine different forms in their first half-hour interview. One of the interventions that BIT developed reduced the number of forms from nine to two, so that the job-seeking conversation could begin straight away.
· Before making the case for new kinds of energy switching services as part of the Consumer Empowerment Strategy, members of the team tested a range of existing services, in order to understand better why only 17% of people switch every year. Several members of the team switched their supply in the process (eventually).
The key point in this process is not to jump to policy conclusions, but to understand better what the key constraints and opportunities might be within the existing system. An open mind is essential during this phase of the project.
Indeed, it is usually best if the intervention has been ‘co-designed’ with the organisations that have deep experience of the area.
Step 3: Design the intervention

The EAST framework comes in at Step 3, where the intervention or policy is created. EAST can be used to structure idea generation (“How could we make this action as easy as possible?”), or to conduct a quicker ‘sense check’ of a policy from a behavioural perspective (“Have we made this action as easy as possible?”). This stage can make use of both EAST and MINDSPACE: while MINDSPACE summarises a wide range of behavioural effects, EAST provides a short set of action-orientated principles for busy policy makers.
The first three steps described above – until you start to test and trial your intervention (Step 4) – are not strictly linear. They can have many iterations and feedback loops. The experience of building the intervention may lead you to reconsider the feasibility of the outcome you defined, for example. New information may change how the costs and the benefits stack up. The key is to maintain a balance between being strategic and opportunistic.
Step 4: Test, learn, adapt
Finally, most BIT interventions will use a ‘randomised controlled trial’ (RCT) to test whether the intervention is having the intended effect. BIT considers RCTs to be an essential part of its methodology. RCTs are what enable the team to show the efficacy of the behavioural insights that they apply to public policy, so that we can estimate wider impacts and cost effectiveness; test acceptability;
and adapt future interventions accordingly.
What makes RCTs different from other types of evaluation is the introduction of a randomly assigned control group, which enables you to compare the effectiveness of a new intervention against what would have happened if you had changed nothing (or used an alternative method). A control group eliminates many problems of comparison that normally complicate the evaluation process.
For example, if you introduce a new ‘back to work’ scheme, how will you know whether those receiving the extra support would have found a job anyway?
In the fictitious example below in Figure 1, we can see that those who received the back to work intervention (‘INTERVENTION’) were much more likely to find a job than those who did not. Because we have a control group, we know that it is the intervention that achieves the effect and not some other factor (such as generally improving economic conditions). With the right academic and policy support, RCTs can be much cheaper and simpler to put in place than is often supposed (and quicker to demonstrate results).
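The logic of that comparison can be sketched in a few lines of Python. The numbers below are illustrative only (not from any actual trial), and the two-proportion z-test shown is just one common way of comparing a treatment group against a control group.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Compare outcome rates between an intervention group and a control
    group using a two-proportion z-test (normal approximation)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p_a - p_b, p_value

# Illustrative numbers only: 550 of 1,000 people found work with the
# back-to-work intervention, vs 500 of 1,000 in the control group
diff, p = two_proportion_z_test(550, 1000, 500, 1000)
print(f"difference: {diff:.1%}, p-value: {p:.3f}")
```

Because both groups were randomly assigned, any background factor (such as improving economic conditions) affects them equally, so a significant difference can be attributed to the intervention itself. Real trial analyses are often more elaborate, but the core comparison is this simple.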
By enabling us to demonstrate just how well a policy is working, RCTs can save money in the long term — they are a powerful tool to help policy makers and practitioners decide which of several policies is the most cost-effective, and also which interventions are not as effective as might have been supposed.
Doing this is especially important in times of shrinking public sector budgets. For more details on running RCTs, see our report ‘Test, Learn, Adapt’.
This method applies policy interventions informed by the growing body of behavioural research and by a sophisticated understanding of the situation in the field, combined with rigorous testing and trialling. These are the hallmarks of the Behavioural Insights Team’s methodology, and we think they should become more routine aspects of a policy maker’s toolkit.
Behavioural Pitfall 4: Assuming which interventions will work

As EAST shows, there are certain general principles that reliably influence behaviour. But behaviour is complex, and we know that context matters greatly. Therefore, we can never be entirely certain that a particular intervention is going to work – even if there are good reasons to think it will.
Box 3.3 sets out the results of the recent trial to increase organ donation, run by the Behavioural Insights Team, the Department of Health, and the Government Digital Service. One of the successful messages was a social norm — pointing out that every day thousands of people who saw this page signed up. As the graph below shows, the message significantly increased registrations.
Two other options included an image as well as this message: one had the message plus the NHS organ donation logo; the other had the message plus a picture of a group of people. There are good reasons to think that adding these images would reinforce the message and increase its impact.
Indeed, when we presented the eight messages to 55 health experts, 22% thought the norm and picture would be most effective; 13% thought the norm and logo would be; and no one voted for the norm message alone.
Although the social norm message increased registration rates, the logo had no additional impact, while including the picture actually performed worse than no message at all – wiping out all the effects of the social norm. We would not have known about this unexpected negative effect if we had not tested it.
[Figure: Percentage of people registering as organ donors, by variant of message]
6. Conclusion

UK policy makers increasingly apply behavioural insights. The Behavioural Insights Team has now worked on most areas of domestic public policy, and many departments have developed their own expertise in this field. The Civil Service Reform Plan states that all policy makers should be able to apply behavioural insights, at least at a basic level, and these approaches are now integrated into training and development provision.61 In addition, many other countries are now applying behavioural insights to their own policy issues.
We have created the EAST framework to help respond to this growing interest and to help develop policies that improve the situation of individuals and society in general. The EAST framework is explicitly designed for applying insights in practice, and should be used alongside the existing MINDSPACE and Test, Learn, Adapt resources. However, given the complexity of behaviour, we also urge policy makers to get the advice of experts and academics on what has the best chance of success – and how success can best be measured.
Looking to the future, we think that applying behavioural insights should
increasingly deal with three main issues:
· Replication. We should not assume that the first result we obtain will necessarily hold true in the future (although we have found that this is often the case). Therefore, there is a need to assess whether an intervention produces similar results in different settings.
· Segmentation. Not everyone reacts the same way to an intervention.
The results given in this paper are headline figures showing average effects.
As data capabilities improve, it will become increasingly easy to understand how different groups react. Tailoring interventions for maximum effect adds another dimension for performance improvement.
· Complexity. BIT initially focused on relatively simple ‘one-off’ behaviours, mainly because they were easiest to measure. However, as our work in Jobcentres shows, we are now developing more complex interventions to address more complex behaviours.