Service assessments: no longer Dragon’s Den


By Hannah Beresford

For most of us who have worked on a digital service for the public sector, the mention of a 'service assessment' likely fills us with dread. Thoughts of a Dragon's Den-style panel can quickly send product teams into panic mode, as they frantically start preparing materials, often several months in advance, even when the bulk of the work is yet to be done.

[Image: Photo by Elisa Ventur on Unsplash]

For the lucky ones who have yet to experience a service assessment, let's quickly go over what it involves. In 2011, the Government Digital Service (GDS) introduced service assessments as a way of ensuring product teams develop good quality digital services. For a rigorous approach, they developed a 'service standard' with key criteria to be met in the core product development areas: research, design, technology, service performance and agile working. This comprehensive evaluation makes perfect sense given the critical nature of these services and their use by a wide cross-section of the public. Not being able to easily apply for a visa or register for childcare benefits could have serious consequences for individuals across the UK.

However, at Deloitte Digital (DD), where we work with a wide range of public sector clients to build customer-centric organisations, we felt that the assessment process had become fraught with anxiety and apprehension, and unnecessarily so. Whilst product teams recognised the need for rigour, a 'not met' could mean significant additional work, which always came at a cost. We wanted to change this view, so we set about developing a training course and helpful materials to put our fellow colleagues at ease. We were on a mission to dispel fears, boost confidence and equip teams with straightforward materials for smooth preparation (for those in all 'Product'-related roles: Business Analysts, User Researchers, Content Designers… you get the picture).

Doing our research

Having prepared for assessments in the past ourselves, we (a User Researcher, a Service Designer and a Delivery Lead) reflected on what we had found helpful. In reality, there wasn't much guidance out there: only a few vague pointers about the importance of showing you had 'Understood your users' or 'Solved a whole problem for users'. But what did this really mean? What were our assessors actually looking for?

Delving a bit deeper, we came across an important piece in the jigsaw: previous assessment reports. These documents, published online after each assessment, provide detailed feedback against each of the key criteria, including what teams had done well and any gaps identified. Whilst a treasure trove of insights, each report needed to be manually trawled, a big job indeed! Never shy of a challenge, the team set about combing all the reports from the past two years (34 in total!), coding comments as 'positives' or 'watch outs' and clustering these into themes. Full of clues about what assessors were looking for and the common reasons for success or failure, these themes ended up forming the backbone of our top tips training.

[Image: Example of our post-it notes on Miro for analysing previous assessment reports]

The logistical bits

To flesh out our tips, we were lucky at DD to be able to draw on colleagues' first-hand experience (collectively, we have recently passed over 10 assessments).
In our interviews, we found that our colleagues had a lot to say about navigating the assessment process more broadly, so we distilled some info and tips for each stage: pre-assessment, the day itself and post-assessment. This included:

Pre-assessment
- Follow best practice from day one. Consider the assessment criteria and expectations before you dive in.
- Communicate with your assessors early on. Make them aware of your service and the progress you're already making, and get an assessment date in the diary well in advance.
- Keep a RAG log (red/amber/green) to stay on track and avoid a mad rush at the end. Assign clear responsibilities and timeframes aligned to each service area.

The day itself
- The assessment is four hours long (with breaks!). The team starts with an overview of their service and their users/user research, before a Q&A from the panel on the remaining areas.
- The panel consists of a Lead Assessor and a designated assessor for User Research, Design and/or Content, Technology, and Service Performance. You should also bring a representative for each of these areas, as well as a Product and/or Service Owner and a Delivery Lead (see diagram below). Make sure you have the people with all the knowledge in the room. You can bring more people, but you'll need to agree this in advance.

Post-assessment
- Relax! There's nothing more you can do. Take a breather.
- Wait for your result (this usually takes 3–5 days). If it's a 'not met', there'll be some action points to address before returning for a re-assessment (don't worry, this isn't common!).

[Image: The information sheet we created detailing who attends the assessment]

[Image: The information sheet we created detailing what happens on the day of the assessment]

Top tips for each service area

Our research gave us a whole load of insights to work with. We needed to make this easy to digest, so we created a framework aligned to the core assessment areas and dropped our tips into the relevant buckets. We later recognised this framework could, in fact, be used for presenting material: simply lifting and shifting it onto Miro and adding the relevant evidence would help teams structure their thinking and preparation and easily tick the boxes.

[Image: The service assessment framework we created to help teams easily prepare for assessments]

There isn't the bandwidth for all our tips in this blog, but I'll share a few key pointers:

Problem and service vision: Own the assessment

A 'not met' outcome (the assessors prefer not to use the word fail!) often results from your team not taking the reins. Own the assessment: state upfront what is and isn't in scope, the primary user need(s) and the top pain points you are tackling. Also think about how to easily navigate your materials. Don't let that hard work go to waste: dazzle the panel by seamlessly signposting to the relevant evidence. Miro is a good medium for this; simply lifting and shifting our assessment framework should do the trick.

Research and design: Clearly articulate your design and delivery process

You'll likely already know what best practice design looks like. Nevertheless, you'll need to be explicit about your well-oiled process for grounding design decisions in robust research. Avoid reeling off the reams of insights you uncovered and, instead, have a few dedicated examples up your sleeve to illustrate your tried and tested approach.

Our team: Democratise your ways of working

The assessors want to see that you are working in the open and drawing on input from the whole team. Shout about what you're doing, and do this often!
Invite the whole team to observe the research sessions so they can hear from your users first-hand. Plus, this way, they'll be much more inclined to put those insights into practice. Simply replicate your agile design process in how you work as a team: continuously improve your ways of working based on team members' feedback and demonstrate how you did this.

[Image: A photo from our first training session at Deloitte Digital]

Impact

At the end of the day, participants left the training with three important things. One, a deeper understanding of service assessments and greater confidence preparing for them. Two, a PDF document jam-packed with templates, tips and tricks. And three (arguably the most important), STASH: 'service assessment ready' stickers to show off to their colleagues.

Since the first training session, we have noticed a tangible impact. The framework has directly contributed to several successful assessments for our public sector clients, and the Miro layout has been used to replace unwieldy presentations. We've also started offering mock assessment practice and one-to-one feedback sessions on assessment materials to provide a steady source of support.

In 2023, we're hoping to reach the end of the year having dispelled fears of a Dragon's Den panel, with a DD office full of laptops proudly brandishing our 'service assessment ready' stickers.

[Image: Our 'service assessment ready' stickers awarded at the end of the training]

By Hannah Beresford, User Researcher, Deloitte Digital

Service assessments: no longer Dragon's Den was originally published in the Deloitte UK Design Blog on Medium.


