[Originally published on the Australian Government Public Sector Innovation Network under a Creative Commons 3.0 BY AU licence]
The following post details my personal reflections on the Behavioural Exchange 2014 Conference. An overview of what emerged from the conference is provided in a separate post.
In my opinion, behavioural insights offer a lot for public servants. The field helps provide a richer understanding of people, how and why they might react or act, and a more sophisticated set of empirical tools and strategies for assessing the effectiveness of interventions.
It also helps us as individuals understand our own limitations and biases. It offers potential strategies for ensuring that our organisational processes and systems not only avoid exacerbating or encouraging those biases, but may even help counteract them.
Yet I think there are some things that need to be considered further, particularly around how behavioural insights fits with innovation and design, and what may limit the application of these fields to more complex problems.
Nudging and design
Design thinking and behavioural insights have a number of characteristics in common. Design is also about changing behaviour, whether getting people to use new products or to interact with their environment or services in different ways.
In addition, they share:
- A bias towards action – both methodologies recognise that the answer(s) often will not be known until you start, so it is better to begin and see what the results show, rather than identifying the answer first and then proceeding
- Iteration – likewise, both emphasise the need for iteration, for trying something and then changing it in light of results and feedback, rather than investing resources in perfecting something only to find out that it does not work as hoped
- Empathy – both recognise and emphasise the humanity of the people we are dealing with, whether we refer to them as stakeholders, clients, customers, or citizens, and their interests, perspectives, biases and preferences
- Immersion and depth – both value methods such as ethnography and field research, which provide an in-depth understanding of the context and the lived experience of people
- Dealing with a dynamic system – both recognise that they are dealing with a dynamic system, where the intervention or nudge can change the context, change behaviour or change the nature of the problem/issue being dealt with.
However, the two approaches, as currently practised, do seem to differ in a number of important ways as well.
- Nudge appears to be more incremental in its focus, sitting more at the efficiency/refinement end of the innovation spectrum, whereas design is stronger at coming up with new approaches to try and new conceptions or framings of problems
- Nudge currently has a much stronger data focus than design. That is not to say that design does not rely on evidence, but that it is, as an approach, perhaps less effective at quantifying that evidence
- Design practice perhaps has a more mature understanding of the importance of the broader ecosystem of interests. This may merely reflect the stage that design is at, rather than anything intrinsic. Both approaches are aided by involving decision makers, those involved in creating a better outcome, and those affected
- Design is better at eliciting and understanding underlying motivations and drivers, rather than just behaviours
- Because of its current focus on transactional matters, nudging may be at risk of improving poor strategies to the point that they become sufficiently less bad to be tolerable, thus accidentally cutting off investigation of more promising avenues.
In his presentation, Professor Kees Dorst noted that the possibilities of nudge go much further than fixing the existing order. Nudging could help with the creation of new states of play. He noted that many of the challenges faced now are more open, networked and complex than before. When a conventional organisation is faced with such problems, it tends to make new rules for itself, which end up leaving the organisation more stuck. Professor Dorst advocated that we need to allow more complexity within organisations, but in a way that does not lead to chaos. Perhaps a better integration or mutual support of design and nudging will help with that?
Humility and scope creep
An interesting aspect to the discussions at the conference was the explicit humility about nudging. There was recognition that it had limits, that behavioural insights as a practice would run into issues, and that there is still much to be learnt. It was refreshing to see a conference focussed on a methodological approach be so aware of the limitations.
Yet at the same time there was a certain degree of what seemed to be ‘scope creep’. At times behavioural insights seemed to be stretched well beyond its core, and presented as suitable for applying to nearly any situation and problem.
There was discussion of the many things that it was good at, and an intent to move beyond the ‘low hanging fruit’ or easier victories and get into the more substantive issues. There seemed to be less acceptance, though, of the possible barriers it might run into when that does happen.
In my experience, this is something that can happen with any discipline or methodology (the old “when all you have is a hammer, everything is a nail” situation). It is certainly something I have seen with innovation and design (and economics for that matter). Perhaps this is an inevitable tension for any approach, even ones as cross-disciplinary as these?
Questions
I came away from the conference with a number of questions (some of which had been discussed there) that I thought needed further reflection.
- Does a focus on ‘what works’ potentially lead us to try to replicate that without fully understanding the context?
- Should everything that works be scaled, or is it sometimes better to leave something that works as it is, and try to replicate the conditions that led to it elsewhere?
- Does a focus on ‘what works’ significantly reduce the options considered as worth exploring/testing?
- In an era of stretched resources, how do public sector agencies find or leverage the skills to do this well? Does the public service have access to the necessary skills to do randomised controlled trials well?
- Is there a risk that nudging may make a bad policy/approach more effective, so that it is seen as successful rather than prompting a look at alternative approaches?
- Can approaches that work in transactional and service delivery matters be translated to work on complex policy problems?
Nudging, design, and innovation
So how do nudging/behavioural insights and design fit together with innovation?
Though I feel I’m a bit closer to an answer because of the conference, I still feel like I have a way to go before I could answer that question usefully. But here’s my attempt…
To me design offers a structured but flexible and iterative process for understanding problems in a way that identifies new options and solutions for exploration and testing. If you need new ideas and new strategies for dealing with a problem or a need, then design is a great way to go.
To me nudging and behavioural insights offers an approach that can give a much greater degree of empiricism and methodological rigour to the work of the public sector. It offers a much fuller understanding of how people act, if not why. If you want to improve transactions or service processes by choosing from a select range of options, then nudging can offer a lot.
To me both approaches could learn from and build off each other. Both have a richer understanding of people than is perhaps offered by more traditional models. Both value understanding the context more than many traditional processes in the public sector might. Both are more flexible and guided by experience and evidence than more traditional, linear models.
I think nudging could be strengthened by a greater understanding of the power of motivations and drivers in shaping behaviour, in addition to psychological/neuroscience findings.
I think design could be enhanced by better picking up on the empirical methods and data collection offered by behavioural insights.
I think both approaches can contribute to innovation, but that design is much better placed for more radical innovation, whereas nudging better fits with incremental/efficiency type innovation.
But I’d be grateful for any more eloquent distinctions or connections between the three – so if you have one, please don’t be shy!