Across government, there’s an understanding that engagement with stakeholders is a good thing. It helps in the successful creation and implementation of policy, especially when it’s done at an early stage. However, this awareness hasn’t yet translated into consistently well-designed online engagement opportunities – though there has been an increase in the number of platforms that would help to make this possible.

So we all want to engage better. And now, more than ever, we can. We want to start a discussion about how to make this happen. There are lots of things to talk about, so we’d like to do this in phases or chunks.

What are we trying to achieve here?

We want to work with people within and outside of government to develop a “matrix” of digital engagement tools. Our aim is to help anyone looking to engage or consult to identify the platforms or tools that can help them do this more effectively. This opening phase of the discussion is a first step to getting there.

We’re pleased to be doing this alongside Demsoc, who are thinking more widely about how to make Open Policy Making a reality. It goes without saying that a large part of that is getting formal consultation and engagement right – though as Anthony says in his post, there’s more to Open Policy Making than just that.

Why does Government consult or engage?

A common characteristic of good engagement or consultation is that it is very “outcome-focused”. So we thought we’d start there too. We’ve identified six “outcomes” that policymakers and service delivery teams might be seeking to achieve when they engage with their stakeholders:

  1. I want to generate ideas: The aim here is to draw on the knowledge and expertise of a wide range of stakeholders to identify creative solutions to existing problems. This works best when no set ideas on a policy or its implementation have been developed, and/or there’s flexibility on the scope and direction of a discussion, i.e. you’re happy to see where things take you. Stakeholders should be able to contribute quickly and easily, and it can work particularly well when you’re able to incentivise the right people to participate. Examples involving government include Idea Street and Show Us A Better Way. While it is sometimes desirable for ideas generation to be an unstructured process, at other times it is better kept within a set scope. The Redbridge participatory budgeting exercise is a good example of the latter.
  2. I want to test out new ideas: The aim here is to sound out new ideas or possible policy directions on a representative sample of the target audience. This is a good way to gauge opinion on the current situation. It might also help to define scopes for new policy areas and to identify opinion-leaders and influencers.
  3. I want to create/design a document/service or deliver a project in collaboration with relevant stakeholders (examples: Idea Street, Solutions Exchange): This tends to be delivery-focused and works best when the consultation has the buy-in of delivery or operational teams and there is scope to incorporate respondents’ proposals and ideas. The focus on delivery also has implications for the audience type and size – knowledgeable (often heavy service users as well as practitioners and technical experts) and relatively small. You are most likely to succeed when you have an engaged community, and this will take time to build.
  4. I want to draw on dispersed knowledge: The aim here is to gather dispersed information within a given, usually fairly large, population to help build an evidence base for setting policy or for identifying areas to which departmental resources should be applied. A good example of this is the Red Tape Challenge. Successfully achieving this outcome often requires a low-effort means for respondents to provide their input, and internal resources to analyse the feedback, coordinate action and communicate progress to respondents.
  5. I want to get detailed and focused feedback within a tightly defined framework: This outcome is most likely to be achieved when stakeholders have already been engaged at an earlier stage or where there is already broad consensus about the justification or benefits of the policy. The primary stakeholders should be clearly identified, and the scope for discussion should be well defined and properly communicated to respondents. This approach does not generally allow respondents to propose radically different policies or implementation options. Two good examples are the Public Reading Stage consultation on the draft Small Charitable Donations Bill and the consultation on the provisions in the draft Care and Support Bill. (Note: the term ‘primary stakeholders’ is not synonymous with ‘large institutional stakeholders’.)
  6. I want to address misconceptions and clarify objectives through discussion and engagement: The outcome here is to bust myths, persuade or explain the rationale for a policy or its implementation – usually a service. The approach is discursive and responsive. Often there is no scope for discussing alternative policy options, and this is made clear.

We’d be very surprised if the outcomes we’ve identified above prove to be exhaustive, so this is the first thing we’d like to talk to you about:  (1) Are there any other desired outcomes that you can think of? (2) Are the descriptions and examples we’ve provided for each outcome good enough? (3) Are any of these desired outcomes not appropriate for government digital engagement or consultations?
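To make the idea of the matrix a little more concrete, here’s a very rough sketch of how a mapping from desired outcomes to candidate approaches might eventually be expressed. To be clear, the outcome labels, the example approaches and the suggest_approaches helper below are illustrative placeholders drawn from the examples above, not a finished design:

```python
# A very rough, illustrative sketch of the "matrix": a simple lookup from a
# desired outcome to candidate engagement approaches. The outcome labels and
# example approaches are placeholders drawn from this post, not recommendations.
ENGAGEMENT_MATRIX = {
    "generate ideas": [
        "open ideation platform (e.g. Idea Street)",
        "challenge competition (e.g. Show Us A Better Way)",
    ],
    "test out new ideas": [
        "sounding out a representative sample of the target audience",
    ],
    "collaborative design or delivery": [
        "practitioner community platform (e.g. Solutions Exchange)",
    ],
    "draw on dispersed knowledge": [
        "crowdsourced evidence gathering (e.g. Red Tape Challenge)",
    ],
    "focused feedback within a defined framework": [
        "commentable draft document (e.g. Public Reading Stage)",
    ],
    "address misconceptions and clarify objectives": [
        "responsive, moderated discussion or Q&A",
    ],
}


def suggest_approaches(outcome: str) -> list[str]:
    """Return candidate approaches for a desired outcome (empty list if unknown)."""
    return ENGAGEMENT_MATRIX.get(outcome, [])


# Example: a team wanting to build an evidence base from dispersed knowledge.
print(suggest_approaches("draw on dispersed knowledge"))
```

In practice the matrix will almost certainly need more dimensions than a simple lookup – audience type and size, resource requirements, timescales – and that’s part of what we’d like to work out through this discussion.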

Next, we’d like to share, and seek discussion with you on, some successful approaches that have been taken by central and local government to achieve these outcomes. We’ll be presenting these as a series of case studies and asking your views on what’s relevant. Looking forward to the ensuing discussions!

35 thoughts on “The Matrix”

  1. Great article.
    For me the issue is that central government has bodies that work on national projects but rarely seem to work with the practitioners within local organisations. True collaboration is where all levels work towards a common goal that is achievable and deliverable, but is also what the community wants.
    Another concern of mine is the cost of delivering systems. There was a stark difference in the costs of systems in 2008 in relation to Crime Mapping: I showed that local practitioners could not only work together to deliver within the timeframe, but also at a much reduced cost to the taxpayer … http://npwmg.blogspot.co.uk/2011/02/crime-mapping-look-at-beginning-of.html
    There is a group of us within the #lgovsm chat currently looking at a similar idea to share ideas, best practice, code etc., with the aim of supporting each other, reducing costs to organisations and improving customer contact. It would be good to work together on this.
    Cheers
    Sasha Taylor
    Founder: BlueLightCamp, CityCampCoventry

    • Hi Sasha. Thanks for your comment. I agree that more needs to be done to involve practitioners working at local authority level. I would argue that the same applies when it comes to practitioners based in government agencies working at national level too.

      Sticking to the questions at hand, do you think the desired outcomes identified in the post are comprehensive? Or do you think there are other outcomes that government works towards or should be working towards? Do you think these desired outcomes would make a material difference to the work of organisations operating at a local level, or are the desired outcomes fine and it’s just the implementation that needs to be improved? It would be great to hear your thoughts on these things.

  2. Hi,

    It’s an interesting article. I guess one of the bigger challenges is opening up the ‘matrix of digital engagement tools’ throughout government. IT limitations and restrictions mean that many channels are not available to all departments, and this could lead to disenfranchisement going forward.

    • Hi Daren. You’re quite right – any tool we build to assist those who are trying to consult on issues of policy or its implementation must be accessible. Sir Bob Kerslake has talked about the need to increase this type of digital accessibility for civil servants. I know that people like Emer Coleman (here at GDS) are working on escalating and resolving the restrictions that civil servants are facing in this area. I think that being clear about what the desired outcomes for engagement/consultation are will really help to support the arguments for lifting these restrictions.

      I’d really like to hear your thoughts on whether the 6 outcomes we’ve identified above are comprehensive enough, or whether organisations that restrict staff’s access to the tools they need to engage have different desired outcomes.

    • Actually, I don’t have a problem with government using a consultation to play for time because it doesn’t know what to do. In fact, I’d rather Government occasionally admitted its own uncertainty.

      The challenge is to find the right sort of consultation (or better, engagement) for that scenario so the “pause” is used to build a better understanding of public opinion and the options for action.

  3. A general comment I always make is that, given any feedback, particularly if adverse, there is largely a failure to respond. So, if a rational objection to a policy is made that is contrary to the current controlling ideology, it tends to be ignored (not just put on a backburner).

    Any feedback, apart obviously from the slanderous and obscene, deserves a response and not just to be conveniently buried! That response should explain why it can’t be actioned, or how it will be considered in future.

    Ignoring contentious policy comments doesn’t solve them; it drives them underground to await another opportunity.

    • Hi Mick, we absolutely agree with your point that responses to consultations deserve a clear reply! Some of our research has shown that, unfortunately, a summary of responses to a consultation will not always address specific points raised and can lean towards grouping feedback into themes. Whilst this is perhaps a useful or necessary mechanism for handling large volumes of responses which reflect similar sentiments, it can sometimes leave respondents feeling that their individual views have not been considered or, as you phrased it, potentially ‘buried’.

      We will be considering some top-level guidelines for best practice at a later stage in this exercise, so we hope that debate will focus in detail on precisely how feedback should be managed effectively and how respondents can be clear that their views, however contentious, are acknowledged and taken forward. Many thanks for your post.

      • I have to agree that feedback is a crucial element of all consultations. You quote the Redbridge budget consultation as an example – as a Redbridge resident I took part in that consultation but was disappointed by the generalised feedback, which failed to address issues raised in my response. It is interesting that Redbridge is currently consulting on Health & Well Being (http://tinyurl.com/9oa5bvb) but that consultation is being done offline and via more traditional meetings.

  4. Hello there,

    I would be interested to hear your thoughts about how to build a model which bridges all user-sourced evidence to support effective decisions, whether it is blue-sky policy making or changes to existing policies/services. To declare my interest, I am a client-side research manager. When I arrived in the civil service from the private sector five years ago, I couldn’t get my head round whether consultations were more about a legal requirement to consult and very much less about a kind of co-creation, e.g. ‘here is our solution, do you have a problem with it?’ vs ‘here is our problem, what ideas do you have for solutions?’. Having read the Red Tape Challenge responses, it is apparent that it is much easier to explain the problem with the way things are and much less easy (and I include myself) to come up with workable and deliverable solutions. I suspect the latter may therefore need more of a moderated approach, for which there are a few options.

    So turning to your list, I think 1 to 4 would, outside the public sector, be classic customer research objectives; 5 is consultations territory and 6 is a communications objective. Perhaps there are a couple more: ‘I want to evidence that I have given stakeholders an opportunity to engage’ and ‘I want to better understand how well (or not) current x is working in order to frame the things that need to change’. NB I think we need clarity about the strengths and weaknesses of crowdsourcing for category 4, i.e. it tells you that there might be an issue but cannot scale the size of the issue in the general population. We all know that the response curve for satisfaction ratings for online surveys is very different to telephone surveying.

    It may be that appropriate action standards are put in place. The tools will eventually be in place for all to use, but we must still make sure the disciplines of a good brief are in place, along with recognising the skillsets required. Could a decision matrix be developed to guide towards the best tool/approach for each presenting issue? I’ve got plenty of scenarios I can offer but don’t want to go on too long! Let me know if I can help.

    • Hi Ingrid, I think you’re spot on about the difficulty of determining the scale of an issue identified through a crowd-sourcing exercise. I wonder if we can learn more about how we might do this from organisations such as YouGov? Given your past experience I’d really appreciate your thoughts on good examples we might be able to learn from.

      It would be great to get some of your scenarios. In my next blog post, I’ll be setting out our rough attempt at mapping some digital engagement and consultation approaches to the “desired objectives” described above, and seeking discussion on it. I’ll also be seeking a discussion on the format of the case-studies which outline these digital engagement and consultation approaches; so your help with the scenarios would be very much appreciated.

      I think your point about the purpose of the consultation process is also a really good one. I think the scenarios wherein there is a statutory requirement to consult are a relatively small proportion of the total number of consultations. What’s interesting is that, whether or not there’s a statutory requirement to consult, the format of most government consultations remains the same, and I feel we look to other means, e.g. offline stakeholder meetings with known stakeholder groups, to achieve most of the desired outcomes. The effect is that we haven’t really invested much time or effort in trying to develop online engagement as an effective means of achieving these outcomes. I’m hoping the discussion about the matrix, and the tool itself, will help towards tackling this problem.

      Regarding the two additional outcomes you described, “I want to evidence that I have given stakeholders an opportunity to engage” and “I want to better understand how well (or not) current x is working in order to frame the things that need to change”, could you clarify what policymakers might be trying to achieve through the former? Is this about transparency perhaps? Also don’t you think the latter is covered by outcome 4 already? If not, what additional or different objectives could be achieved by it? I think I’m now in danger of going on too long, so I’ll stop :-) but I’d really like to hear more of your thoughts. Thanks.

  5. Hi Lena,

    Guidelines need to cover how the responses should be summarised; much of the feedback won’t be quantitative, so it isn’t always easy to judge the prevailing view. It also needs to be clear to respondents what will be done with their feedback and what decisions will be made. At least then the contract is clear. In reality, the resource required to respond in the way you have described isn’t going to be available. (None of my comments to the Cabinet Office have received any acknowledgement, and they weren’t all crazy ideas!)

    • Hi Ingrid,
      Thank you for your interesting and helpful post. I think you are quite right in saying that it’s much easier to identify where the current processes are failing and leaving participants feeling disenfranchised than to point to a solution which will be all things to everyone. At this opening stage of the debate we are keenly focused on developing consensus, so if it follows that a required outcome for respondents is to feel that their view has been considered and/or taken forward, then we want to understand how this should be done – for example, to what extent acknowledgement needs to be personalised, and whether there’s a strong consensus about what methods achieve “my response was worthwhile and made a difference to the outcome”. It would be good to understand whether any of our stated outcomes should involve more tailored responses built into feedback mechanisms, and how this is prioritised within resource constraints.

      We have some initial ideas which should help to refine different stages of the engagement process, clarify the scope and shape the way that participants respond, making any dialogue more meaningful. Clearly there are challenges in balancing large volumes of replies against quality-assuring the feedback process, but we do think there are opportunities to harness digital tools and, at the very least, for every reply to be acknowledged even if it’s an automated message. In your particular example with the Cabinet Office, would this have been helpful or just added to your frustration?

      We think there are some practical solutions to enable a less linear approach, save time and enrich the interaction for all. Your categorisation of the 6 bullets is useful, and your suggestion of evaluating emerging issues via a decision matrix sounds very sensible. Please keep your thoughts coming, and if others are reading this and can support, disagree or make other suggestions, we would love to hear from you too so that we can move on to finalising the aims. Thank you again for sharing your insights.

      • Hi Lena,

        Feedback is hugely important to make the people who’ve taken the time to take part feel that their participation has been a fulfilling experience. Automated responses can be good as a holding message but they won’t address the issue.

        But you’re right – realistically we’ll never have the resources to go back to everyone. Themed responses also often fall short, as they end up being too high-level. From my experience, though, the reason they are so high-level can be down to officials having to cut down on space because they’re still in the mindset of providing a response in PDF form. Lack of granularity due to lack of space is one problem that is easily addressed in the digital world.

        Allowing people to post questions and comments AFTER the decision has been made may go some way to making it a fulfilling experience. It would certainly give people the opportunity to ask, if they don’t feel the response addressed their issue. Again, in my experience, officials often feel that the publication of the response is the end of the discussion, and I’ve known them to feel uneasy about any further engagement on the issue once the decision has been made. Maybe that’s part of the answer?

        But I wonder whether we’re placing too much pressure on government responding to everyone individually? Surely the point of social media is that it enables many-to-many communication. We may find that if we open up the debate in this way, non-government participants will jump in to answer other participants’ contributions, and that will be a fulfilling experience in itself. Government’s role could then be more one of chair of the discussion, or orchestrator.

        • Hi Aidan,
          Good points, well made, and it’s good that you’ve highlighted how we’re still mimicking the paper-constrained process online when digital offers many more possibilities for linking to relevant data according to user need. It would be interesting to get behind the reasons why some officials are uneasy about prolonging the engagement, so that we can explore whether these barriers can be overcome with the right tools. I totally agree about opening up the debate – the earlier the better, as the discussion can become almost self-regulating and this can save time in the deliberative process. We hope to explore these options later; however, is there anything you or other readers can add to the current desired outcomes? We’d like people to contribute here in the very way you described, lest we take silence as approval! Many thanks for your ideas.

  6. Hi Ade,

    This is a subject close to my heart, and I think your matrix covers the main objectives.

    I agree with Ingrid that these combine research and comms objectives, but I don’t feel this is an issue as for me they are intertwined. This is something you suggest early on – it’s not just about creating the policy but also about making the environment in which it’s going to be implemented as favourable as possible.

    So I’d like to see outcome 6 broadened to ‘I want to gain buy-in from those who care about the issue by allowing them to see why and how the policy (proposals) reached its current form’. Good policy-making is hard, and decisions aren’t taken lightly. The challenge for a policy-maker is to weigh up all these competing interests and make sense of them.

    The problem with traditional closed consultations is that many of those who take part may fail to appreciate that there are people/groups whose views are diametrically opposed to theirs, but equally valid.

    Providing an open forum for them to engage not only with government but with each other, within the context of the policy issue, may lead to more collaborative decision-making. At the very least it will lead to a better understanding of how hard it is for government to please everyone.

    And it may also provide future generations with an audit trail of what the spirit of the policy was at the time it was developed.

    • Your point regarding outcome number 6 is a really good one, as I think explaining the rationale and evolution of a policy dovetails nicely with the aim of “addressing misconceptions and clarifying objectives through discussion and engagement”. So how about:

      “I want to address misconceptions, clarify rationale and objectives, and outline the evolution of a policy and its implementation: The outcome here is to bust myths, persuade or explain the rationale for a policy or its implementation (very often a government service). The approach is discursive and responsive. Often there is no scope for discussing the implementation of alternative policy options, but an explanation of how the policy (proposals) reached its current form can be provided. The aim is to gain buy-in from those who care about the issue by allowing them to see why and how the policy was developed, the range of differing stakeholder perspectives, and the competing agendas which had to be balanced.” Does that sound right?

      By the way, your posts triggered a discussion with one of my colleagues on whether as the use of open forums increases, there will still be a need for official summaries. And if so, how to include and address valid minority views (especially when they’re controversial) while being clear that they are minority views. Hopefully as the discussion develops and we begin to discuss the behaviours and practices that should underpin our implementation of open policy making, we can discuss this too. Thanks for posting, Aidan.

  7. Ade, you might get some helpful insight from The Consultation Institute…
    http://www.consultationinstitute.org/#/about-us/4562374242

    Also take a look at this from Stratagem…
    http://www.stratagem-ni.com/…/Policy_making_in_the_age_of_austerity_report14. 06.12.pdf

    If you want to know how we are doing what you’re interested in for more than half of UK local government and several central government agencies – take a look at this…

    http://www.theysaidyoudid.com/

    Enjoy :-)
    Regards
    Patrick King.

    • Hi Patrick. Thanks for the tip – I haven’t read this report yet. One of the Institute’s associates kindly contacted us at the start of this project and offered to get involved. I remember your comments on the subject of consultation and engagement during one of the Institute for Government’s Better Policymaking events. It would be great to read your thoughts on some of the points of view being presented on the site. In particular, we’re planning a discussion around analysis tools, and if I remember correctly that was one of your areas of interest.

  8. Ade, I greatly enjoyed Clay Shirky’s recent TED talk on the use of tools like GitHub for open policy making:
    http://www.ted.com/talks/clay_shirky_how_the_internet_will_one_day_transform_government.html

    How would this fit into the matrix (3 – collaboration)? Perhaps one of the effects of this kind of approach is that the community decides what it is interested in changing/discussing, rather than waiting for a prompt from government.

    I think it addresses something that Lena and Aidan referred to – that we tend to look at ways that we can mimic offline processes online, rather than using the innovative capacity implicit in new online tools. Tools that Clay Shirky describes as ‘a new form of arguing’ that are ‘large, distributed and low cost’.

    • Hi Christian. I enjoyed the Clay Shirky video too but, to be honest with you, I’m not sure how it does fit with the 6 objectives set out here. Then again, I don’t think I expect it to. To my mind, Shirky’s talk focused on the “how”, not the “why”; the objectives here fall into the category of the “why”. I’m open to alternative views though.

      Regarding your second point, I agree – we do need to learn to use these relatively new online tools in a way that is in line with the underlying philosophies of the communities from which they’ve emerged. The issue isn’t just how these tools are used but the extent to which the engagement they generate influences policymaking. How do you see this working?

      By the way, have you read the comments related to Anthony Zacharzewski’s post on the Clay Shirky talk? Think you might find the points being made there interesting.

  9. Just had a wee look at the comments and generally agree with the points made. It is probably true that distributed decision making couldn’t change the benefits system overnight, or cope with the many contentious issues embedded within the different policies governing the benefits system. However, isn’t that just a matter of granularity? What I mean by this is perhaps by addressing the smaller digestible issues through a process like distributed decision making you can solve some of the problems from the bottom up, rather than from the top down?

    Not all issues will be appropriate for this way of working, obviously, but again aren’t we applying an established way of working to this new approach, rather than looking at ways in which the process might define the outcome?

    • Hey Christian. You’re right – not every policy area will be appropriate for this way of working. You could be right that this way of working might forestall some of the bigger problems; having said that, I can’t see it for things like the benefits system. One reason is that so much of that system is experienced offline, and incorporating feedback from the offline system into the online collaboration process will be difficult. I allow that I could be wrong on this. I think tools such as Git will become part of the policymaking toolkit, but until we start using them it’s difficult to see how applicable they will be to a process for which they were not originally designed. Perhaps what we’ll end up with will be a modified version. I like the idea of tools evolving differently in different environments. What do you think?

  10. Pingback: “At home” or in the pub? | Open Policymaking

  11. I think this also needs to have something about staying within legal parameters or, where I work, possibly changing the law to accommodate the new policies. This needs to happen at the very start of a policy’s life: is it within the legal framework already, or does the legal framework need to be adapted or changed?

    I think that consultation is the best place for this, because it is often during this stage that legalities not thought of previously come to light, when others are involved in the ideas and brainstorming sessions.

  12. Pingback: Whatever happened to open policy?

  13. Pingback: Whatever happened to open government and open policy? A scorecard

  14. Pingback: Whatever happened to open government and open policy? A scorecard | The Democratic Society

  15. Pingback: Online consultation tools | Inside GOV.UK

  16. Pingback: Guerilla Voice: If consultation isn’t working, where’s the alternative?

  17. Pingback: Whatever happened to open government and open policy? A scorecard « Demsoc Open Policy

  18. Pingback: Building the Matrix: Is this the kind of thing you had in mind? « Demsoc Open Policy

  19. Pingback: Whatever happened to open policy? « dev.demsoc.org

  20. Pingback: Building the Matrix: Is this the kind of thing you had in mind? « dev.demsoc.org
