
Discovery at the Ministry of Justice

Posted by: Scott, Posted on: 23 January 2020 - Categories: collaboration, Discovery, process

Hello, I’m Scott and in this post I’m going to explain how:

  • I started using discovery
  • we improved discovery at the MoJ
  • we define and review discovery at the MoJ. 

You might know me as Head of Product but I also lead Service Standards here at MoJ. Several departments have asked me to share our approach to discovery and discovery review. I’ve been promising to do this for months and finally got around to it. Happy New Year!

How I started using discovery

I used to work with nonprofits, charities, and social enterprises. We had to pitch for funding from investors. Most investors wanted a well-defined solution to invest in, at a time when we had the least confidence in the problem we were solving. This could lead to a two-year project to build a solution to a problem that didn’t really exist, or wasn’t valuable to solve for users. I wanted an efficient way of defining a problem before looking for solutions.

I was introduced to the Design Thinking Bootleg from the Institute of Design at Stanford in 2010, then dug a little deeper and found the Design Council’s ‘double-diamond’ framework from 2005, and The Four Steps to the Epiphany by Steve Blank, also from 2005. I’d been using these for a year or so when The Lean Startup by Eric Ries was published in 2011. Eric explicitly pulled together these influences into a single, combined approach to discovery. The following year I saw the GDS Service Manual for the first time and recognised that it had clearly been influenced by The Lean Startup, translating it for socially-driven organisations and helpfully formalising the ‘discovery phase’. The UK government’s commitment to discovery is a big reason why I joined the MoJ’s digital team in 2015.

How we improved discovery at the MoJ

In 2017, my colleague Nabeeha and I both led work to improve the MoJ’s use of discovery. We both recognised that discovery is the most valuable stage in the life of any new product, since poor problem definition at this stage can lead to bigger and bigger issues over time. However, there was no formal review of discovery before progressing to the next stage of product development, and it was clear that discovery is hard, with limited support and guidance for teams on how to do it well.

Nabeeha (Head of User Research) introduced discovery reviews in 2017. GDS does not require a discovery review, and until then a discovery would be reviewed briefly at the MoJ’s weekly, one-hour portfolio review as part of a large agenda. There might be a few minutes to discuss the discovery before moving onto the next agenda item. Nabeeha designed a one-hour discovery review to make space for us to check that a problem is worth solving before we spend time exploring possible solutions.

At the same time, I ran a review and retrospective of discovery in government in order to see what we could learn. This generated 10 experiments you can try to improve discovery and provided insights to support Nabeeha’s work. 

Nabeeha chaired reviews of all MoJ discoveries from Summer 2017 until Autumn 2018 and shared 6 thoughts from 15 discoveries. I was part of the review panel during this time and took over as Chair in Autumn 2018, folding discovery review into our overall approach to service assessment.

Since then we’ve had another 10-15 discoveries. I’ve recently paired with Isabel Santafe (interim Head of User Research) to simplify and improve our definition and review of discovery, based on what we’ve learned from 25-30 discoveries over the last 2-3 years. 

How we define and review discovery at the MoJ

The following is copied and pasted from our internal Service Standard Handbook. 

Definition

A product discovery figures out if and why we should develop a product. 

It defines a problem by answering the following discovery questions:

  • Is there a problem?
  • Is it valuable to solve?
  • Is it feasible for us to solve?
  • Is it urgent?
  • Is it pervasive?
  • Are users solving it themselves?
  • Has someone else solved the problem?

If we can answer 'yes' to the first five questions and 'no' to the final two questions then we've probably found a problem worth solving.
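
To make the decision rule concrete, here’s a minimal sketch in Python. It’s illustrative only: the names and structure are mine, not part of our handbook.

```python
# A minimal sketch of the discovery decision rule described above.
# Illustrative only: not part of the MoJ Service Standard Handbook.

DISCOVERY_QUESTIONS = [
    "Is there a problem?",
    "Is it valuable to solve?",
    "Is it feasible for us to solve?",
    "Is it urgent?",
    "Is it pervasive?",
    "Are users solving it themselves?",
    "Has someone else solved the problem?",
]

def worth_solving(answers):
    """Apply the rule: 'yes' to the first five questions and 'no' to the
    final two means we've probably found a problem worth solving."""
    if len(answers) != len(DISCOVERY_QUESTIONS):
        raise ValueError("expected one yes/no answer per discovery question")
    return all(answers[:5]) and not any(answers[5:])

# Example: a valuable, feasible, urgent, pervasive problem that users
# aren't already solving themselves and nobody else has solved.
print(worth_solving([True, True, True, True, True, False, False]))  # True
```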

Notes:

  • All discoveries are research. But not all research is discovery. We do not need to label all research as 'discovery'. We only call research 'discovery' when we're exploring a single opportunity, trying to figure out if and why we should build a product.
  • GDS provides guidance on discovery; the following adds to that guidance based on what we've learned from our own discoveries since Summer 2017.

Review

Every product discovery is reviewed. The review is a chance for the discovery team to:

  • stop a discovery because now is not the right time to solve the problem
  • ask for more investment in discovery to find out if now is the right time to solve the problem
  • ask for investment in an alpha.

Discovery teams share their decision to stop/extend/progress by answering the discovery questions.

Discovery reviews last one hour and are split into two halves:

  • The first half is a chance for the discovery team to share their answers to the discovery questions.
  • The second half is a chance for the panel to learn how the discovery team came to their conclusions.

Discovery review panels bring three perspectives:

  • Product management will explore the value of the problem to be solved:
    1. Based on the potential value of solving the problem, what's the most we should invest in developing a solution?
    2. How many users does a successful product require in order to justify the estimated investment requested by the team? This will become the 'target for adoption' and will be used during live assessment to measure 'benefits realisation' (a hypothetical worked example follows this list).
    3. What are the main jobs that a product will help users to do? What is your benchmark for current performance, and what is your team's target for improvement? These figures will become the 'target for improvement' and will be used during live assessment to measure 'benefits realisation'. They're often referred to as your 'key performance indicators'.
    4. What team do you need in alpha?
    5. What do you need to learn in alpha to be confident that we should buy, build, or borrow a solution?
  • Research will explore how your team came to its conclusions:
    1. Is your evidence impartial?
    2. What's the context of the problem you've explored?
    3. What are the riskiest assumptions and key hypotheses about the problem, and how did you test them?
    4. Have you taken the time to understand your users and their context?
    5. Have you engaged with different types of users?
    6. Have you generated sufficiently detailed insights into your users?
    7. What assumptions about solutions are you looking to explore in alpha?
  • Portfolio will ask where your discovery sits in the bigger picture:
    1. What investment was agreed for the discovery, and how much has the team spent?
    2. Was the investment approved with any additional Spend Control conditions, and have these conditions been met?
    3. Does this work remain a priority for your unit, and does it appear on your quarterly digital pipeline?
    4. What's the maximum investment required in alpha, and has that investment been agreed?
We also ask two specific questions:

  • What is your approach to GDPR and Accessibility compliance?
  • What is the name of the person who will take ownership of any solution that's developed (often referred to as senior responsible owner)?

There you have it, that’s how we define and review discovery at the MoJ. Sorry it’s taken so long to write this post, I hope it’s helpful. Do you use discovery in your organisation? If so, how do you define and review it? If not, what approach do you take?
