‘Pre-discovery’ has existed within digital teams for years. I first heard of ‘pre-discovery’ back in 2016 and didn’t understand what it meant. The following year I ran a review and retrospective of discovery across digital teams in government and found that no one else really understood what it meant either.
In 2019 I worked with colleagues across MOJ Digital and Technology to figure out if we needed something before discovery. If so, what did it look like, what should we call it, and what does success look like?
We did need something and we decided that it should be called ‘exploration’, not ‘pre-discovery’. We designed our approach based on several experiences of ‘pre-discovery’ and have now tested it with two explorations. Here’s what it looks like today.
Definition of exploration
Exploration provides space for large-scale or wide-ranging research and analysis. It provides space to identify multiple opportunities to improve the value of our services, or the ways we work. This was previously known as ‘pre-discovery’. We no longer call it ‘pre-discovery’ because this name presupposes that all opportunities are suitable for discovery. Our experience is that opportunities may also be projects, organisational improvements, and other shapes of work. We should keep our options open at this early stage.
Exploration has a broad scope. It explores things like a whole system, policy area, or programme space. It identifies multiple opportunities and analyses these opportunities in order to prioritise the most valuable, useful and feasible. This is in contrast to discovery, which is relatively narrow in focus and explores a single opportunity.
Exploration:
- is suitable for wide-ranging research covering a large space that will generate multiple opportunities
- is action-oriented - it is the minimum research and analysis required to highlight the opportunities that are the most valuable, useful, and feasible for MoJ D&T or our internal clients. It is not comprehensive academic research
- has clear investment - the amount we invest in exploration should be worked out based on (i) the potential value of improving the focus of exploration, (ii) the confidence we have that we can take action based on the opportunities we identify, and (iii) the value of this exploration relative to other exploration going on within MoJ D&T
- identifies clear opportunities and how to capture them
- looks for opportunities in a broad space - it's valuable for us to be consistent in how we capture, analyse and share opportunities.
Opportunities will usually be:
- changes to existing services which can be handed over to existing product teams or live support
- discovery for a new or rebuilt product
- project-work to improve how we work
We recommend using an ‘opportunity template’ to summarise findings. This makes it easier to compare opportunities and share recommendations with Triage and other decision makers.
Capture the critical information needed to understand the extent and importance of the problem. Tying the background to the goal statement reduces waste by lowering the risk of focusing on the wrong areas.
- Current condition and problem statement
This is what the business stakeholder wants to address, in simple, understandable terms and not as a lack-of-solution statement. For example, avoid statements like ‘Our problem is we need a Case Management System.’
- Goal statement
How will we know that our efforts were successful at the end of implementation? Ideally, we need a single metric for success. For example, ‘Our goal is to reduce system failures compared to the previous test results of 22 major issues; our target is to reduce this by 20%.’
- Root-cause analysis
Detail the hypothesis and assumptions or a set of experiments performed to test for cause and effect.
- Opportunity analysis
List the steps of an experiment to test the opportunity and define the pass/fail criteria for testing the opportunity.
- Follow-up actions and report
Identify further steps and share what you have learned with your team or organisation.
- We recommend that each opportunity is no longer than two sides in length.
- This opportunity analysis section is particularly important when concluding exploration. It will help you:
- decide which of the opportunities are valuable enough to proceed with
- prioritise between opportunities
- clearly communicate the case for further investment.
How do we review exploration?
The team:
- identifies the assumptions they want to test; their constraints and dependencies; their approach to testing those assumptions; and the proposed output of their exploration
- reviews this overview with (i) Isabel Santafe, MoJ D&T's Head of Research, and (ii) Nick Glover, Internal Controls Manager, for feedback on whether the approach is sound and the return on investment is clear (1-hour kick-off review)
- has ad-hoc, informal contact between the research team and Head of Research and Head of Controls during the work (as needed)
- holds a 1-hour review of the research once complete, with the research team demoing their insights and showing how they matched the original aims.
Who's in an exploration team?
Teams are empowered to figure out who they need. What follows is simply guidance based on review and retrospection of similar activity.
We assume that:
- exploration is research and analysis, so is led by these professions, i.e. user research, business analysis, and service design
- this type of work functions best with a core ‘team’ of 2-3 people - the core team should be the fewest people needed to test the most assumptions possible
- this doesn’t discount working with additional collaborators on an ad-hoc basis if needed. For example, you may want a delivery manager to help build your team charter and establish your workflow, ad-hoc consultancy from a product manager to help define and prioritise opportunities, or technical consultancy from engineering when reviewing software systems.
There you have it, that’s how we define and review exploration at the MoJ. Do you use something like exploration in your organisation? If so, how do you define and review it? If not, what approach do you take?